
How far is Meixian from Xingyi?

The distance between Xingyi (Xingyi Wanfenglin Airport) and Meixian (Meixian Airport) is 704 miles / 1133 kilometers / 612 nautical miles.

The driving distance from Xingyi (ACX) to Meixian (MXZ) is 903 miles / 1454 kilometers, and travel time by car is about 16 hours 22 minutes.

Xingyi Wanfenglin Airport – Meixian Airport: 704 miles / 1133 kilometers / 612 nautical miles


Distance from Xingyi to Meixian

There are several ways to calculate the distance from Xingyi to Meixian. Here are two standard methods:

Vincenty's formula (applied above)
  • 704.103 miles
  • 1133.144 kilometers
  • 611.849 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
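The ellipsoidal figure can be reproduced with a short sketch using pyproj's Geod class, which solves the same WGS84 geodesic inverse problem (via Karney's algorithm rather than Vincenty's iteration, so it agrees with the value above to well under a millimetre). The decimal coordinates are converted from the DMS values listed under "Airport information" below.

    # Ellipsoidal (WGS84) distance between ACX and MXZ.
    # Sketch using pyproj; Geod solves the geodesic inverse problem
    # that Vincenty's formula approximates iteratively.
    from pyproj import Geod

    # Decimal degrees from the DMS coordinates in the airport information
    acx_lat, acx_lon = 25.0864, 104.9592   # 25°5′11″N, 104°57′33″E
    mxz_lat, mxz_lon = 24.3500, 116.1328   # 24°21′0″N, 116°7′58″E

    geod = Geod(ellps="WGS84")
    _, _, meters = geod.inv(acx_lon, acx_lat, mxz_lon, mxz_lat)

    print(f"{meters / 1000:.3f} km")       # ≈ 1133.1 km
    print(f"{meters / 1609.344:.3f} mi")   # ≈ 704.1 mi
    print(f"{meters / 1852:.3f} nmi")      # ≈ 611.8 nmi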

Haversine formula
  • 702.924 miles
  • 1131.247 kilometers
  • 610.825 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
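A minimal haversine sketch, assuming the usual 6371 km mean Earth radius, reproduces the spherical figures above to within rounding:

    # Great-circle (haversine) distance between ACX and MXZ on a spherical Earth.
    from math import radians, sin, cos, asin, sqrt

    def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
        """Great-circle distance in kilometres between two lat/lon points."""
        phi1, phi2 = radians(lat1), radians(lat2)
        dphi = radians(lat2 - lat1)
        dlmb = radians(lon2 - lon1)
        a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
        return 2 * radius_km * asin(sqrt(a))

    km = haversine_km(25.0864, 104.9592, 24.3500, 116.1328)  # ACX -> MXZ
    print(f"{km:.3f} km")              # ≈ 1131.2 km
    print(f"{km / 1.609344:.3f} mi")   # ≈ 702.9 mi
    print(f"{km / 1.852:.3f} nmi")     # ≈ 610.8 nmi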

How long does it take to fly from Xingyi to Meixian?

The estimated flight time from Xingyi Wanfenglin Airport to Meixian Airport is 1 hour and 49 minutes.
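The page does not state how this estimate is derived. A common rule of thumb divides the flight distance by a typical cruise speed and adds a fixed allowance for taxi, climb and descent; the cruise speed and allowance in the sketch below are assumed values chosen for illustration, not the calculator's published parameters.

    # Hypothetical flight-time estimate: distance / cruise speed + fixed allowance.
    # The 500 mph cruise speed and 25-minute allowance are assumptions,
    # not parameters published by the calculator.
    distance_mi = 704.103
    cruise_mph = 500          # assumed typical jet cruise speed
    allowance_min = 25        # assumed taxi/climb/descent allowance

    total_min = distance_mi / cruise_mph * 60 + allowance_min
    hours, minutes = divmod(round(total_min), 60)
    print(f"about {hours} h {minutes} min")   # ≈ 1 h 49 min with these assumptions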

What is the time difference between Xingyi and Meixian?

There is no time difference between Xingyi and Meixian.
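Both cities are in mainland China, which observes a single time zone (China Standard Time, UTC+8). A quick check with Python's zoneinfo, assuming both map to the Asia/Shanghai zone:

    # Both cities use China Standard Time, so the offset difference is zero.
    from datetime import datetime
    from zoneinfo import ZoneInfo

    now = datetime.now(ZoneInfo("Asia/Shanghai"))  # zone covering mainland China
    print(now.utcoffset())                         # 8:00:00 for both Xingyi and Meixian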

Flight carbon footprint between Xingyi Wanfenglin Airport (ACX) and Meixian Airport (MXZ)

On average, flying from Xingyi to Meixian generates about 125 kg of CO2 per passenger, which is equivalent to roughly 276 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
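The unit conversion and the per-kilometre intensity implied by the page's own numbers can be checked directly; the 125 kg figure itself comes from the page, not from this sketch.

    # Check the CO2 figure's unit conversion and the implied per-km intensity.
    co2_kg = 125                     # per-passenger estimate quoted above
    distance_km = 1133.144           # Vincenty distance from above

    print(f"{co2_kg * 2.20462:.0f} lbs")                 # ≈ 276 lbs
    print(f"{co2_kg / distance_km:.3f} kg CO2 per km")   # ≈ 0.110 kg per passenger-km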

Map of flight path and driving directions from Xingyi to Meixian

See the map of the shortest flight path between Xingyi Wanfenglin Airport (ACX) and Meixian Airport (MXZ).

Airport information

Origin: Xingyi Wanfenglin Airport
City: Xingyi
Country: China
IATA Code: ACX
ICAO Code: ZUYI
Coordinates: 25°5′11″N, 104°57′33″E
Destination: Meixian Airport
City: Meixian
Country: China
IATA Code: MXZ
ICAO Code: ZGMX
Coordinates: 24°21′0″N, 116°7′58″E
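The coordinates above are given in degrees, minutes and seconds. A small helper (the function name is just illustrative) converts them to the decimal degrees used in the distance sketches earlier:

    # Convert degrees/minutes/seconds to decimal degrees (negative for S/W).
    def dms_to_decimal(degrees, minutes, seconds, hemisphere="N"):
        value = degrees + minutes / 60 + seconds / 3600
        return -value if hemisphere in ("S", "W") else value

    print(dms_to_decimal(25, 5, 11, "N"))    # ACX latitude  ≈ 25.0864
    print(dms_to_decimal(104, 57, 33, "E"))  # ACX longitude ≈ 104.9592
    print(dms_to_decimal(24, 21, 0, "N"))    # MXZ latitude  = 24.35
    print(dms_to_decimal(116, 7, 58, "E"))   # MXZ longitude ≈ 116.1328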