How far is Lijiang from Kuqa?

The distance between Kuqa (Kuqa Qiuci Airport) and Lijiang (Lijiang Sanyi International Airport) is 1427 miles / 2296 kilometers / 1240 nautical miles.

The driving distance from Kuqa (KCA) to Lijiang (LJG) is 2042 miles / 3287 kilometers, and travel time by car is about 41 hours 30 minutes.

Distance from Kuqa to Lijiang

There are several ways to calculate the distance from Kuqa to Lijiang. Here are two standard methods:

Vincenty's formula (applied above)
  • 1426.798 miles
  • 2296.209 kilometers
  • 1239.853 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
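
For reference, here is a minimal Python sketch of the standard Vincenty inverse iteration on the WGS-84 ellipsoid. This is an independent implementation of the published algorithm, not the calculator's own code, and the decimal coordinates are converted from the airport table below:

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Ellipsoidal distance in meters via Vincenty's inverse formula (WGS-84)."""
    a = 6378137.0              # semi-major axis (m)
    f = 1 / 298.257223563      # flattening
    b = (1 - f) * a            # semi-minor axis (m)

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate until the auxiliary longitude converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)  # equatorial-line special case
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
        - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
                                 * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# Kuqa (KCA) to Lijiang (LJG), coordinates in decimal degrees
meters = vincenty_distance(41.7181, 82.9867, 26.6792, 100.2456)
print(meters / 1609.344)  # ≈ 1426.8 statute miles
```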

Haversine formula
  • 1427.151 miles
  • 2296.777 kilometers
  • 1240.160 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
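
A haversine implementation is much shorter, since it only needs a mean earth radius. Here is a minimal sketch; the 6371-km radius is the conventional mean value, an assumption about this page's exact constant:

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in kilometers on a spherical earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h))

print(haversine_distance(41.7181, 82.9867, 26.6792, 100.2456))  # ≈ 2296.8 km
```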

How long does it take to fly from Kuqa to Lijiang?

The estimated flight time from Kuqa Qiuci Airport to Lijiang Sanyi International Airport is 3 hours and 12 minutes.
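
The site does not publish its formula, but such estimates are typically derived from the great-circle distance at an assumed average speed plus a fixed allowance for taxi, climb, and descent. The cruise speed and buffer below are assumptions chosen only to illustrate the approach, which is why the result differs slightly from the 3 hours 12 minutes quoted above:

```python
def estimated_flight_time(distance_miles, cruise_mph=500, buffer_hours=0.5):
    """Rough block-time estimate: cruise leg plus a taxi/climb/descent buffer."""
    hours = distance_miles / cruise_mph + buffer_hours
    h, m = divmod(round(hours * 60), 60)
    return f"{h} hours and {m} minutes"

print(estimated_flight_time(1427))  # "3 hours and 21 minutes" with these assumptions
```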

Flight carbon footprint between Kuqa Qiuci Airport (KCA) and Lijiang Sanyi International Airport (LJG)

On average, flying from Kuqa to Lijiang generates about 175 kg of CO2 per passenger, which is equivalent to 386 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
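
A common way to produce per-passenger figures like this is to multiply flight distance by an average emission factor. The factor below (~0.123 kg of CO2 per passenger-mile) is simply back-calculated from the numbers above; it illustrates the arithmetic, not the calculator's published methodology:

```python
def co2_estimate_kg(distance_miles, kg_per_passenger_mile=0.123):
    """Per-passenger CO2 from jet-fuel burn, using an average emission factor."""
    return distance_miles * kg_per_passenger_mile

kg = co2_estimate_kg(1427)
print(round(kg))            # ≈ 176 kg per passenger
print(round(kg * 2.20462))  # ≈ 387 lbs
```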

Map of flight path and driving directions from Kuqa to Lijiang

See the map of the shortest flight path between Kuqa Qiuci Airport (KCA) and Lijiang Sanyi International Airport (LJG).

Airport information

Origin: Kuqa Qiuci Airport
City: Kuqa
Country: China
IATA Code: KCA
ICAO Code: ZWKC
Coordinates: 41°43′5″N, 82°59′12″E

Destination: Lijiang Sanyi International Airport
City: Lijiang
Country: China
IATA Code: LJG
ICAO Code: ZPLJ
Coordinates: 26°40′45″N, 100°14′44″E
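
The coordinates above are given in degrees, minutes, and seconds, while the distance formulas earlier on this page take decimal degrees. A small sketch of the conversion (the function name is illustrative):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to decimal degrees."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

print(dms_to_decimal(41, 43, 5, "N"))    # ≈ 41.7181 (Kuqa latitude)
print(dms_to_decimal(100, 14, 44, "E"))  # ≈ 100.2456 (Lijiang longitude)
```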