
How far is Samjiyon from Nantong?

The distance between Nantong (Nantong Xingdong Airport) and Samjiyon (Samjiyon Airport) is 792 miles / 1275 kilometers / 689 nautical miles.

The driving distance from Nantong (NTG) to Samjiyon (YJS) is 1334 miles / 2147 kilometers, and travel time by car is about 25 hours 13 minutes.

Nantong Xingdong Airport – Samjiyon Airport

792 miles / 1275 kilometers / 689 nautical miles


Distance from Nantong to Samjiyon

There are several ways to calculate the distance from Nantong to Samjiyon. Here are two standard methods:

Vincenty's formula (applied above)
  • 792.453 miles
  • 1275.329 kilometers
  • 688.623 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
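Vincenty's inverse method iterates on the difference in longitude over the auxiliary sphere until it converges. A sketch in Python using the standard WGS-84 parameters (the iteration limit and convergence tolerance here are conventional choices, not values stated on this page):

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Return the geodesic distance in metres on the WGS-84 ellipsoid."""
    a = 6378137.0              # semi-major axis (m)
    f = 1 / 298.257223563      # flattening
    b = (1 - f) * a            # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    # Reduced latitudes on the auxiliary sphere
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0         # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                   if cos2_alpha else 0.0)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sm + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sm ** 2)
            - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
                * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma)
```

Fed the airport coordinates listed at the bottom of this page (converted to decimal degrees), this reproduces the 1275.329 km figure above.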

Haversine formula
  • 793.096 miles
  • 1276.364 kilometers
  • 689.182 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
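The haversine formula is compact enough to sketch directly. The mean Earth radius of 6,371 km used here is a conventional choice; a slightly different radius shifts the result by a fraction of a kilometre:

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in km between two lat/lon points on a sphere."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    # Haversine of the central angle between the two points
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h))
```

With the airport coordinates from this page, the result lands within a kilometre of the 1276.364 km figure above; the small gap versus Vincenty reflects the spherical versus ellipsoidal Earth models.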

How long does it take to fly from Nantong to Samjiyon?

The estimated flight time from Nantong Xingdong Airport to Samjiyon Airport is 2 hours and 0 minutes.
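The page does not state how the flight time is derived, but the round 2-hour figure is consistent with a common rule of thumb: roughly 30 minutes for taxi, takeoff, and landing, plus cruise time at about 850 km/h. A sketch under those assumed parameters:

```python
def estimated_flight_time(distance_km, cruise_kmh=850, overhead_min=30):
    """Rough flight-time estimate as (hours, minutes).

    cruise_kmh and overhead_min are assumptions, not values from the page.
    """
    total_min = overhead_min + distance_km / cruise_kmh * 60
    return divmod(round(total_min), 60)
```

For the 1275.329 km Vincenty distance, this yields 2 hours 0 minutes, matching the estimate above.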

What is the time difference between Nantong and Samjiyon?

The time difference between Nantong and Samjiyon is 1 hour: Samjiyon is on Pyongyang Time (UTC+9), one hour ahead of Nantong on China Standard Time (UTC+8).

Flight carbon footprint between Nantong Xingdong Airport (NTG) and Samjiyon Airport (YJS)

On average, flying from Nantong to Samjiyon generates about 134 kg of CO2 per passenger, which is roughly 296 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
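The figures on this page imply an emission factor of roughly 0.105 kg of CO2 per passenger-kilometre (134 kg over 1275 km). Treating that factor as an assumption (real factors vary by aircraft type and load factor), the estimate can be reproduced as:

```python
def flight_co2_kg(distance_km, kg_per_pax_km=0.105):
    """Per-passenger CO2 estimate in kg.

    kg_per_pax_km is inferred from this route's published figures,
    not an official emission factor.
    """
    return distance_km * kg_per_pax_km
```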

Map of flight path and driving directions from Nantong to Samjiyon

See the map of the shortest flight path between Nantong Xingdong Airport (NTG) and Samjiyon Airport (YJS).

Airport information

Origin Nantong Xingdong Airport
City: Nantong
Country: China
IATA Code: NTG
ICAO Code: ZSNT
Coordinates: 32°4′14″N, 120°58′33″E
Destination Samjiyon Airport
City: Samjiyon
Country: North Korea
IATA Code: YJS
ICAO Code: ZKSE
Coordinates: 41°54′25″N, 128°24′35″E
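The coordinates above are given in degrees, minutes, and seconds, while the distance formulas earlier on the page take decimal degrees. A small conversion helper (the hemisphere letters follow the N/S/E/W convention used in the listings above):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# NTG: 32°4′14″N, 120°58′33″E  ->  (32.0706, 120.9758)
ntg = (dms_to_decimal(32, 4, 14, "N"), dms_to_decimal(120, 58, 33, "E"))
# YJS: 41°54′25″N, 128°24′35″E  ->  (41.9069, 128.4097)
yjs = (dms_to_decimal(41, 54, 25, "N"), dms_to_decimal(128, 24, 35, "E"))
```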