
How far is Lijiang from Hami?

The distance between Hami (Hami Airport) and Lijiang (Lijiang Sanyi International Airport) is 1174 miles / 1890 kilometers / 1020 nautical miles.

The driving distance from Hami (HMI) to Lijiang (LJG) is 1829 miles / 2944 kilometers, and travel time by car is about 33 hours 27 minutes.

Hami Airport – Lijiang Sanyi International Airport

1174 miles / 1890 kilometers / 1020 nautical miles


Distance from Hami to Lijiang

There are several ways to calculate the distance from Hami to Lijiang. Here are two standard methods:

Vincenty's formula (applied above)
  • 1174.280 miles
  • 1889.820 kilometers
  • 1020.421 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
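As an illustration, the ellipsoidal distance can be sketched with a direct implementation of Vincenty's inverse formula on the WGS-84 ellipsoid. The coordinates are the decimal-degree equivalents of the airport coordinates listed in the airport information section; the function name is my own.

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2):
    """Vincenty's inverse formula on the WGS-84 ellipsoid; returns kilometers."""
    a = 6378137.0              # WGS-84 semi-major axis (meters)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a            # semi-minor axis (meters)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):       # iterate the auxiliary longitude until it converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0

# Hami (HMI) and Lijiang (LJG) in decimal degrees
print(round(vincenty_km(42.8414, 93.6692, 26.6792, 100.2456), 1))  # about 1890 km
```

Note that this simple iteration can fail to converge for nearly antipodal points; production geodesic libraries handle those cases separately.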

Haversine formula
  • 1176.465 miles
  • 1893.337 kilometers
  • 1022.320 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
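The haversine calculation above can be sketched in a few lines. The coordinates below are the decimal-degree equivalents of the airport coordinates listed further down, and the Earth radius of 6371 km is the conventional mean radius.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points on a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

hmi = (42.8414, 93.6692)   # Hami Airport
ljg = (26.6792, 100.2456)  # Lijiang Sanyi International Airport
print(round(haversine_km(*hmi, *ljg), 1))  # roughly 1893 km
```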

How long does it take to fly from Hami to Lijiang?

The estimated flight time from Hami Airport to Lijiang Sanyi International Airport is 2 hours and 43 minutes.
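The site's exact timing model is not published; a common rule of thumb estimates flight time as cruise time at an average airliner speed plus a fixed allowance for takeoff and landing. The sketch below assumes 500 mph and a 30-minute overhead, which lands in the same ballpark as the 2 hours 43 minutes quoted above.

```python
def estimated_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough flight-time estimate: cruise time plus a fixed takeoff/landing overhead."""
    total_min = distance_miles / cruise_mph * 60 + overhead_min
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} h {minutes} min"

print(estimated_flight_time(1174))  # prints "2 h 51 min" under these assumptions
```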

What is the time difference between Hami and Lijiang?

There is no time difference between Hami and Lijiang.

Flight carbon footprint between Hami Airport (HMI) and Lijiang Sanyi International Airport (LJG)

On average, flying from Hami to Lijiang generates about 160 kg (353 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
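The pound figure follows directly from the standard kilogram-to-pound conversion factor:

```python
KG_TO_LB = 2.20462  # pounds per kilogram

co2_kg = 160
print(round(co2_kg * KG_TO_LB))  # 353
```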

Map of flight path and driving directions from Hami to Lijiang

See the map of the shortest flight path between Hami Airport (HMI) and Lijiang Sanyi International Airport (LJG).

Airport information

Origin Hami Airport
City: Hami
Country: China
IATA Code: HMI
ICAO Code: ZWHM
Coordinates: 42°50′29″N, 93°40′9″E
Destination Lijiang Sanyi International Airport
City: Lijiang
Country: China
IATA Code: LJG
ICAO Code: ZPLJ
Coordinates: 26°40′45″N, 100°14′44″E
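The coordinates above are given in degrees, minutes, and seconds; the distance formulas need decimal degrees. A minimal conversion sketch (the function name is my own):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds with a hemisphere letter to decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# Hami Airport: 42°50′29″N, 93°40′9″E
print(round(dms_to_decimal(42, 50, 29, "N"), 4))  # 42.8414
print(round(dms_to_decimal(93, 40, 9, "E"), 4))   # 93.6692
```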