
How far is Samjiyon from Xinyuan County?

The distance between Xinyuan County (Xinyuan Nalati Airport) and Samjiyon (Samjiyon Airport) is 2268 miles / 3650 kilometers / 1971 nautical miles.

The driving distance from Xinyuan County (NLT) to Samjiyon (YJS) is 2803 miles / 4511 kilometers, and travel time by car is about 51 hours 48 minutes.

Xinyuan Nalati Airport – Samjiyon Airport

2268 miles / 3650 kilometers / 1971 nautical miles


Distance from Xinyuan County to Samjiyon

There are several ways to calculate the distance from Xinyuan County to Samjiyon. Here are two standard methods:

Vincenty's formula (applied above)
  • 2268.052 miles
  • 3650.077 kilometers
  • 1970.884 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
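As a rough cross-check, an ellipsoidal (WGS-84) geodesic distance can be computed with the pyproj library. This is a minimal sketch using the airport coordinates listed further down the page; note that pyproj uses Karney's geodesic algorithm rather than Vincenty's iteration, so the result should agree with the Vincenty figure above to within a fraction of a metre, but it is not the exact calculation used here.

```python
from pyproj import Geod

# WGS-84 ellipsoid, the same reference surface Vincenty's formula assumes
geod = Geod(ellps="WGS84")

# NLT (Xinyuan Nalati) and YJS (Samjiyon) coordinates in decimal degrees,
# converted from the degree/minute/second values under "Airport information"
nlt_lat, nlt_lon = 43.4317, 83.3783
yjs_lat, yjs_lon = 41.9069, 128.4097

# geod.inv takes longitude/latitude order and returns
# (forward azimuth, back azimuth, distance in metres)
_, _, metres = geod.inv(nlt_lon, nlt_lat, yjs_lon, yjs_lat)

print(f"{metres / 1000:.1f} km / {metres / 1609.344:.1f} miles")
```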

Haversine formula
  • 2262.053 miles
  • 3640.421 kilometers
  • 1965.670 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
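The haversine calculation is simple enough to reproduce directly. The sketch below assumes a mean Earth radius of 6371 km; since the page does not state which radius it uses, small differences from the figures above are expected.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two points on a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

# NLT -> YJS, using the coordinates listed under "Airport information"
km = haversine_km(43.4317, 83.3783, 41.9069, 128.4097)
print(f"{km:.0f} km / {km / 1.609344:.0f} mi / {km / 1.852:.0f} nm")
```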

How long does it take to fly from Xinyuan County to Samjiyon?

The estimated flight time from Xinyuan Nalati Airport to Samjiyon Airport is 4 hours and 47 minutes.
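A flight time of this kind is typically derived from distance divided by an average block speed, plus a fixed allowance for taxi, climb, and descent. The sketch below illustrates that approach; the 500 mph cruise speed and 30-minute overhead are assumptions for illustration only and do not necessarily reproduce the 4 hours 47 minutes quoted above.

```python
def estimate_flight_time(distance_miles, cruise_mph=500, overhead_hours=0.5):
    """Rough block-time estimate: fixed taxi/climb/descent allowance plus cruise."""
    hours = overhead_hours + distance_miles / cruise_mph
    h, m = divmod(round(hours * 60), 60)
    return f"{h} hours {m} minutes"

# 2268 miles is the Vincenty distance from the section above
print(estimate_flight_time(2268))
```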

Flight carbon footprint between Xinyuan Nalati Airport (NLT) and Samjiyon Airport (YJS)

On average, flying from Xinyuan County to Samjiyon generates about 248 kg of CO2 per passenger, which is equivalent to 547 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
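The kilograms-to-pounds conversion and the per-kilometre emission rate implied by these figures can be checked with a few lines of arithmetic. The roughly 68 g of CO2 per passenger-kilometre below is simply derived from the page's own numbers, not an official emissions methodology.

```python
co2_kg = 248        # per-passenger estimate from this page
distance_km = 3650  # flight distance from this page

co2_lbs = co2_kg * 2.20462             # kg -> lb, roughly 547 lbs
implied_factor = co2_kg / distance_km  # ~0.068 kg CO2 per passenger-km

print(f"{co2_lbs:.0f} lbs, {implied_factor * 1000:.0f} g CO2 per passenger-km")
```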

Map of flight path and driving directions from Xinyuan County to Samjiyon

See the map of the shortest flight path between Xinyuan Nalati Airport (NLT) and Samjiyon Airport (YJS).

Airport information

Origin Xinyuan Nalati Airport
City: Xinyuan County
Country: China
IATA Code: NLT
ICAO Code: ZWNL
Coordinates: 43°25′54″N, 83°22′42″E
Destination Samjiyon Airport
City: Samjiyon
Country: North Korea
IATA Code: YJS
ICAO Code: ZKSE
Coordinates: 41°54′25″N, 128°24′35″E