How far is Samjiyon from Meixian?

The distance between Meixian (Meixian Airport) and Samjiyon (Samjiyon Airport) is 1400 miles / 2254 kilometers / 1217 nautical miles.

The driving distance from Meixian (MXZ) to Samjiyon (YJS) is 1927 miles / 3102 kilometers, and travel time by car is about 36 hours 5 minutes.

Distance from Meixian to Samjiyon

There are several ways to calculate the distance from Meixian to Samjiyon. Here are two standard methods:

Vincenty's formula (applied above)
  • 1400.403 miles
  • 2253.731 kilometers
  • 1216.917 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
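As an illustration, the sketch below implements Vincenty's inverse formula on the WGS-84 ellipsoid in Python, using the airport coordinates listed under "Airport information" further down. The function name, convergence tolerance, and iteration cap are illustrative choices and are not stated on this page:

import math

def vincenty_distance(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Ellipsoidal (WGS-84) distance in kilometres via Vincenty's inverse formula."""
    a = 6378137.0               # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563       # WGS-84 flattening
    b = (1 - f) * a             # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))   # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.sqrt((cosU2 * sin_lam) ** 2 +
                              (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        if sin_sigma == 0:
            return 0.0          # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos_sq_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos_sq_alpha
                        if cos_sq_alpha != 0 else 0.0)        # equatorial line
        C = f / 16 * cos_sq_alpha * (4 + f * (4 - 3 * cos_sq_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u_sq = cos_sq_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
    B = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
        B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0             # metres -> kilometres

# Airport coordinates in decimal degrees (MXZ: 24°21′0″N 116°7′58″E, YJS: 41°54′25″N 128°24′35″E)
print(round(vincenty_distance(24.35, 116.132778, 41.906944, 128.409722), 3))
# should come out close to the 2253.731 km quoted above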

Haversine formula
  • 1402.341 miles
  • 2256.849 kilometers
  • 1218.601 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
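For comparison, here is a minimal Python sketch of the haversine formula, again using the airport coordinates from the table below; the mean earth radius of 6371 km is an assumed value, since the page does not state which radius it uses:

import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in kilometres, assuming a spherical earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

# Meixian (MXZ) to Samjiyon (YJS) in decimal degrees
print(round(haversine_distance(24.35, 116.132778, 41.906944, 128.409722), 3))
# should come out close to the 2256.849 km quoted above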

How long does it take to fly from Meixian to Samjiyon?

The estimated flight time from Meixian Airport to Samjiyon Airport is 3 hours and 9 minutes.

What is the time difference between Meixian and Samjiyon?

There is no time difference between Meixian and Samjiyon.

Flight carbon footprint between Meixian Airport (MXZ) and Samjiyon Airport (YJS)

On average, flying from Meixian to Samjiyon generates about 173 kg of CO2 per passenger, equivalent to 382 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.

Map of flight path and driving directions from Meixian to Samjiyon

See the map of the shortest flight path between Meixian Airport (MXZ) and Samjiyon Airport (YJS).

Airport information

Origin: Meixian Airport
City: Meixian
Country: China
IATA Code: MXZ
ICAO Code: ZGMX
Coordinates: 24°21′0″N, 116°7′58″E
Destination: Samjiyon Airport
City: Samjiyon
Country: North Korea
IATA Code: YJS
ICAO Code: ZKSE
Coordinates: 41°54′25″N, 128°24′35″E