How far is Meixian from Lijiang?

The distance between Lijiang (Lijiang Sanyi International Airport) and Meixian (Meixian Airport) is 1005 miles / 1617 kilometers / 873 nautical miles.

The driving distance from Lijiang (LJG) to Meixian (MXZ) is 1363 miles / 2194 kilometers, and travel time by car is about 24 hours 43 minutes.

Lijiang Sanyi International Airport – Meixian Airport: 1005 miles / 1617 kilometers / 873 nautical miles.

Distance from Lijiang to Meixian

There are several ways to calculate the distance from Lijiang to Meixian. Here are two standard methods:

Vincenty's formula (applied above)
  • 1004.540 miles
  • 1616.651 kilometers
  • 872.922 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
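
For readers who want to reproduce the figure, below is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid. The airport coordinates come from the airport information section at the bottom of this page; the iteration cap and convergence tolerance are arbitrary choices, not values documented by this site.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns metres."""
    a = 6378137.0                      # semi-major axis (m)
    f = 1 / 298.257223563              # flattening
    b = (1 - f) * a                    # semi-minor axis (m)

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):               # iterate until lambda converges
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.hypot(cosU2 * sinLam, cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        if sinSigma == 0:
            return 0.0                 # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = cosSigma - 2 * sinU1 * sinU2 / cos2Alpha if cos2Alpha else 0.0
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev, lam = lam, L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (cos2SigmaM + C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < 1e-12:
            break

    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    deltaSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2)
        - B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2) * (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - deltaSigma)

# LJG (26°40′45″N, 100°14′44″E) to MXZ (24°21′0″N, 116°7′58″E)
metres = vincenty_distance(26.679167, 100.245556, 24.35, 116.132778)
print(f"{metres / 1000:.3f} km")       # ≈ 1616.65 km, matching the figure above
```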

Haversine formula
  • 1002.935 miles
  • 1614.067 kilometers
  • 871.526 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
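
As a sketch, the haversine formula fits in a few lines of Python. The mean earth radius of 6,371 km is a common convention; the exact radius this site uses is not stated, so the result may differ from the figure above in the last decimal places.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, r=6371.0):
    """Great-circle distance on a sphere of radius r (km)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * r * math.asin(math.sqrt(h))

print(f"{haversine_km(26.679167, 100.245556, 24.35, 116.132778):.3f} km")
# ≈ 1614 km, matching the haversine figure above
```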

How long does it take to fly from Lijiang to Meixian?

The estimated flight time from Lijiang Sanyi International Airport to Meixian Airport is 2 hours and 24 minutes.
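
The page does not state its timing model, but a common rule of thumb estimates flight time as the great-circle distance flown at a typical jet cruise speed plus a fixed allowance for taxi, climb, and descent. The 500 mph and 30-minute constants below are assumptions, which is why the sketch lands near, but not exactly on, the quoted 2 hours 24 minutes.

```python
def estimate_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    """Cruise leg plus a fixed taxi/climb/descent buffer (both values assumed)."""
    total_min = round(distance_miles / cruise_mph * 60 + overhead_min)
    hours, minutes = divmod(total_min, 60)
    return f"{hours} h {minutes} min"

print(estimate_flight_time(1005))  # -> "2 h 31 min", close to the quoted 2 h 24 min
```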

What is the time difference between Lijiang and Meixian?

There is no time difference between Lijiang and Meixian; both cities observe China Standard Time (UTC+8).

Flight carbon footprint between Lijiang Sanyi International Airport (LJG) and Meixian Airport (MXZ)

On average, flying from Lijiang to Meixian generates about 151 kg (333 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
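
The unit conversion and the implied emission intensity are simple arithmetic, sketched below. The 151 kg figure is the page's own estimate; the conversion factor of 2.20462 lb/kg is standard.

```python
co2_kg = 151                 # per-passenger estimate quoted above
distance_km = 1617

print(f"{co2_kg * 2.20462:.0f} lb")                       # -> 333 lb
print(f"{co2_kg / distance_km * 1000:.0f} g CO2 per km")  # -> ~93 g per passenger-km
```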

Map of flight path and driving directions from Lijiang to Meixian

See the map of the shortest flight path between Lijiang Sanyi International Airport (LJG) and Meixian Airport (MXZ).

Airport information

Origin Lijiang Sanyi International Airport
City: Lijiang
Country: China
IATA Code: LJG
ICAO Code: ZPLJ
Coordinates: 26°40′45″N, 100°14′44″E
Destination Meixian Airport
City: Meixian
Country: China
IATA Code: MXZ
ICAO Code: ZGMX
Coordinates: 24°21′0″N, 116°7′58″E
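
The coordinates above are given in degrees/minutes/seconds, while the distance formulas need decimal degrees. A small helper (hypothetical, not part of this site) performs the conversion used in the earlier sketches:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Signed decimal degrees from DMS plus a hemisphere letter (N/S/E/W)."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# LJG: 26°40′45″N, 100°14′44″E
print(dms_to_decimal(26, 40, 45, "N"), dms_to_decimal(100, 14, 44, "E"))
# MXZ: 24°21′0″N, 116°7′58″E
print(dms_to_decimal(24, 21, 0, "N"), dms_to_decimal(116, 7, 58, "E"))
```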