
How far is Liupanshui from Kashgar?

The distance between Kashgar (Kashgar Airport) and Liupanshui (Liupanshui Yuezhao Airport) is 1890 miles / 3042 kilometers / 1642 nautical miles.

The driving distance from Kashgar (KHG) to Liupanshui (LPF) is 2538 miles / 4085 kilometers, and travel time by car is about 48 hours 38 minutes.

Kashgar Airport – Liupanshui Yuezhao Airport

  • 1890 miles
  • 3042 kilometers
  • 1642 nautical miles
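The three figures above are the same distance in different units. A quick sketch of the conversions, using the exact definitions 1 mile = 1.609344 km and 1 nautical mile = 1.852 km:

```python
# Convert the unrounded Vincenty mileage (quoted later on this page)
# between the three units shown above.
MILE_TO_KM = 1.609344   # exact, by definition
NM_TO_KM = 1.852        # exact, by definition

miles = 1889.929
km = miles * MILE_TO_KM   # ≈ 3041.545 kilometers
nm = km / NM_TO_KM        # ≈ 1642.303 nautical miles
```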


Distance from Kashgar to Liupanshui

There are several ways to calculate the distance from Kashgar to Liupanshui. Here are two standard methods:

Vincenty's formula (applied above)
  • 1889.929 miles
  • 3041.545 kilometers
  • 1642.303 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
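A sketch of Vincenty's inverse method on the WGS-84 ellipsoid, applied to the two airports (coordinates converted to decimal degrees from the DMS values listed below; the implementation follows the standard published iteration, not necessarily this site's exact code):

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Vincenty inverse formula on the WGS-84 ellipsoid, in kilometers."""
    a = 6378137.0              # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a            # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    # Reduced latitudes
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0         # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sm + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sm ** 2)
            - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0

# KHG (39°32′34″N, 76°1′11″E) and LPF (26°36′33″N, 104°58′44″E)
d = vincenty_km(39.5428, 76.0197, 26.6092, 104.9789)  # ≈ 3041.5 km
```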

Haversine formula
  • 1887.919 miles
  • 3038.311 kilometers
  • 1640.557 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
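The haversine calculation is much simpler, since it treats the earth as a sphere. A minimal sketch, assuming a mean earth radius of 6371 km and the same decimal-degree airport coordinates:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given radius, in km."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# KHG to LPF in decimal degrees
d = haversine_km(39.5428, 76.0197, 26.6092, 104.9789)  # ≈ 3038 km
```

The spherical result comes out a few kilometers shorter than Vincenty's ellipsoidal result, which is typical for routes at these latitudes.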

How long does it take to fly from Kashgar to Liupanshui?

The estimated flight time from Kashgar Airport to Liupanshui Yuezhao Airport is 4 hours and 4 minutes.
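Flight-time estimates like this are typically derived from the distance. One common rule of thumb (an assumption for illustration, not necessarily this site's formula) is an average cruise speed of about 500 mph plus a fixed allowance for taxi, climb, and descent:

```python
def estimate_flight_time(distance_miles, cruise_mph=500.0, overhead_min=30.0):
    """Rough flight time as (hours, minutes); constants are assumptions."""
    minutes = overhead_min + distance_miles / cruise_mph * 60.0
    return divmod(round(minutes), 60)

hours, mins = estimate_flight_time(1890)  # → (4, 17)
```

This lands in the same ballpark as the 4 hours 4 minutes quoted above; the exact figure depends on the speed and overhead assumptions used.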

Flight carbon footprint between Kashgar Airport (KHG) and Liupanshui Yuezhao Airport (LPF)

On average, flying from Kashgar to Liupanshui generates about 207 kg of CO2 per passenger, which is roughly 456 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
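The kilogram-to-pound conversion is straightforward (1 kg ≈ 2.20462 lb); note that converting an already-rounded kilogram figure can differ by a pound or so from converting the unrounded value first:

```python
KG_TO_LB = 2.20462262  # pounds per kilogram

lb = 207 * KG_TO_LB    # ≈ 456.36 lb
```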

Map of flight path and driving directions from Kashgar to Liupanshui

See the map of the shortest flight path between Kashgar Airport (KHG) and Liupanshui Yuezhao Airport (LPF).

Airport information

Origin: Kashgar Airport
City: Kashgar
Country: China
IATA Code: KHG
ICAO Code: ZWSH
Coordinates: 39°32′34″N, 76°1′11″E
Destination: Liupanshui Yuezhao Airport
City: Liupanshui
Country: China
IATA Code: LPF
ICAO Code: ZUPS
Coordinates: 26°36′33″N, 104°58′44″E
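The coordinates above are in degrees-minutes-seconds; distance formulas like those earlier on this page need decimal degrees. A small conversion sketch:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds + hemisphere letter to decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

khg_lat = dms_to_decimal(39, 32, 34, "N")    # ≈ 39.5428
khg_lon = dms_to_decimal(76, 1, 11, "E")     # ≈ 76.0197
lpf_lat = dms_to_decimal(26, 36, 33, "N")    # ≈ 26.6092
lpf_lon = dms_to_decimal(104, 58, 44, "E")   # ≈ 104.9789
```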