How far is Ji'an from Meixian?

The distance between Meixian (Meixian Airport) and Ji'an (Jinggangshan Airport) is 193 miles / 311 kilometers / 168 nautical miles.

The driving distance from Meixian (MXZ) to Ji'an (JGS) is 283 miles / 455 kilometers, and travel time by car is about 5 hours 27 minutes.

Distance from Meixian to Ji'an

There are several ways to calculate the distance from Meixian to Ji'an. Here are two standard methods:

Vincenty's formula (applied above)
  • 193.313 miles
  • 311.106 kilometers
  • 167.984 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
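
For readers who want to reproduce the figure, here is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid. The ellipsoid constants are standard WGS-84 values; the iteration cap and convergence tolerance are illustrative choices, not part of the published formula.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2):
    """Inverse Vincenty distance on the WGS-84 ellipsoid, in statute miles."""
    a = 6378137.0           # WGS-84 semi-major axis (metres)
    f = 1 / 298.257223563   # WGS-84 flattening
    b = (1 - f) * a         # semi-minor axis

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate until the longitude difference converges
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.hypot(cosU2 * sinLam,
                              cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        if sinSigma == 0:
            return 0.0  # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = (cosSigma - 2 * sinU1 * sinU2 / cos2Alpha
                      if cos2Alpha else 0.0)  # equatorial line
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (
                cos2SigmaM + C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < 1e-12:
            break

    u2 = cos2Alpha * (a * a - b * b) / (b * b)
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    dSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2)
        - B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2)
          * (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - dSigma) / 1609.344  # metres -> statute miles
```

Called with the decimal-degree coordinates listed under Airport information below, this should land within rounding of the 193.313-mile figure above.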

Haversine formula
  • 193.819 miles
  • 311.921 kilometers
  • 168.424 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
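
The haversine version is much shorter. In the sketch below, the Earth radius of 3,958.8 miles is the conventional mean radius; a slightly different radius shifts the result by a fraction of a mile.

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance in statute miles on a spherical Earth."""
    R = 3958.8  # mean Earth radius in miles
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

# Meixian (MXZ) to Jinggangshan (JGS), decimal degrees
print(haversine_miles(24.3500, 116.1328, 26.8567, 114.7369))  # ~193.8
```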

How long does it take to fly from Meixian to Ji'an?

The estimated flight time from Meixian Airport to Jinggangshan Airport is 51 minutes.
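
The calculator's exact timing model isn't published here. A common rule of thumb, sketched below, adds a fixed allowance for taxi, climb, and descent to the time spent at a typical jet cruise speed. The 30-minute overhead and 500 mph cruise are assumed constants; with them the result comes out near 53 minutes rather than exactly 51, so the site evidently uses slightly different values.

```python
def estimated_flight_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    # Hypothetical model: fixed taxi/climb/descent allowance plus time at cruise.
    return overhead_min + distance_miles / cruise_mph * 60

print(round(estimated_flight_minutes(193.3)))  # ~53 with these assumed constants
```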

What is the time difference between Meixian and Ji'an?

There is no time difference between Meixian and Ji'an.

Flight carbon footprint between Meixian Airport (MXZ) and Jinggangshan Airport (JGS)

On average, flying from Meixian to Ji'an generates about 53 kg of CO2 per passenger; 53 kilograms is equal to about 117 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
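
The emission model behind that figure isn't published either. The sketch below uses a hypothetical per-mile CO2 factor chosen so that this route's distance lands near 53 kg, together with the exact kilogram-to-pound conversion.

```python
LBS_PER_KG = 2.20462262

def co2_kg(distance_miles, kg_per_mile=0.275):
    # kg_per_mile is an illustrative short-haul factor, not the site's
    # published model; it makes ~193 miles yield roughly 53 kg.
    return distance_miles * kg_per_mile

kg = co2_kg(193.3)
print(round(kg), round(kg * LBS_PER_KG))  # ~53 kg, ~117 lbs
```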

Map of flight path and driving directions from Meixian to Ji'an

See the map of the shortest flight path between Meixian Airport (MXZ) and Jinggangshan Airport (JGS).

Airport information

Origin: Meixian Airport
City: Meixian
Country: China
IATA Code: MXZ
ICAO Code: ZGMX
Coordinates: 24°21′0″N, 116°7′58″E
Destination: Jinggangshan Airport
City: Ji'an
Country: China
IATA Code: JGS
ICAO Code: ZSJA
Coordinates: 26°51′24″N, 114°44′13″E
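
The distance formulas above take decimal degrees, while the coordinates here are listed in degrees, minutes, and seconds. A small conversion helper (the function name and hemisphere convention are this sketch's own):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere="N"):
    """Convert degrees/minutes/seconds to signed decimal degrees."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

MXZ = (dms_to_decimal(24, 21, 0, "N"), dms_to_decimal(116, 7, 58, "E"))
JGS = (dms_to_decimal(26, 51, 24, "N"), dms_to_decimal(114, 44, 13, "E"))
print(MXZ, JGS)  # approx (24.35, 116.1328) and (26.8567, 114.7369)
```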