
Distance between Tianshui (THQ) and Lijiang (LJG)

Flight distance from Tianshui to Lijiang (Tianshui Maijishan Airport – Lijiang Sanyi International Airport) is 637 miles / 1026 kilometers / 554 nautical miles. Estimated flight time is 1 hour 42 minutes.

Driving distance from Tianshui (THQ) to Lijiang (LJG) is 917 miles / 1475 kilometers and travel time by car is about 18 hours 22 minutes.

Tianshui – Lijiang

637 miles / 1026 kilometers / 554 nautical miles

How far is Lijiang from Tianshui?

There are several ways to calculate the distance between Tianshui and Lijiang. Here are two common methods:

Vincenty's formula (applied above)
  • 637.321 miles
  • 1025.669 kilometers
  • 553.817 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth’s surface, using an ellipsoidal model of the earth.
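For reference, below is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The function name vincenty_distance_km, its parameter defaults, and the decimal-degree coordinates in the usage comment are illustrative assumptions, not taken from this page; a correct run should land close to the 1,025.669 km figure quoted above.

import math

def vincenty_distance_km(lat1, lon1, lat2, lon2,
                         a=6378137.0, f=1 / 298.257223563,
                         max_iter=200, tol=1e-12):
    """Distance on the WGS-84 ellipsoid via Vincenty's inverse formula, in km."""
    b = (1 - f) * a
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break  # converged

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
                * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0

# THQ (34°33′33″N, 105°51′36″E) and LJG (26°40′45″N, 100°14′44″E) in decimal degrees:
# vincenty_distance_km(34.5592, 105.8600, 26.6792, 100.2456)  # ≈ 1026 km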

Haversine formula
  • 638.357 miles
  • 1027.336 kilometers
  • 554.717 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
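As a concrete illustration, here is a short Python sketch of the haversine formula. The mean Earth radius of 6,371 km and the helper name haversine_distance_km are assumptions for this example; with the airport coordinates listed further below it should come out near the 1,027 km figure above.

import math

def haversine_distance_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two points on a sphere (haversine formula), in km."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

# THQ and LJG coordinates converted to decimal degrees:
# haversine_distance_km(34.5592, 105.8600, 26.6792, 100.2456)  # ≈ 1027 km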

Flight Duration

Estimated flight time from Tianshui Maijishan Airport to Lijiang Sanyi International Airport is 1 hour 42 minutes.

Time difference

There is no time difference between Tianshui and Lijiang.

Carbon dioxide emissions

On average, flying from Tianshui to Lijiang generates about 117 kg of CO2 per passenger; 117 kilograms is equal to 259 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.

Map of flight path and driving directions from Tianshui to Lijiang

Shortest flight path between Tianshui Maijishan Airport (THQ) and Lijiang Sanyi International Airport (LJG).

Airport information

Origin Tianshui Maijishan Airport
City: Tianshui
Country: China
IATA Code: THQ
ICAO Code: ZLTS
Coordinates: 34°33′33″N, 105°51′36″E
Destination Lijiang Sanyi International Airport
City: Lijiang
Country: China
IATA Code: LJG
ICAO Code: ZPLJ
Coordinates: 26°40′45″N, 100°14′44″E