
How far is Samjiyon from Xi'an?

The distance between Xi'an (Xi'an Xianyang International Airport) and Samjiyon (Samjiyon Airport) is 1184 miles / 1905 kilometers / 1029 nautical miles.

The driving distance from Xi'an (XIY) to Samjiyon (YJS) is 1441 miles / 2319 kilometers, and travel time by car is about 27 hours 17 minutes.

Xi'an Xianyang International Airport – Samjiyon Airport

Distance: 1184 miles / 1905 kilometers / 1029 nautical miles


Distance from Xi'an to Samjiyon

There are several ways to calculate the distance from Xi'an to Samjiyon. Here are two standard methods:

Vincenty's formula (applied above)
  • 1183.756 miles
  • 1905.070 kilometers
  • 1028.656 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
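The page does not publish its implementation; the following is a minimal sketch of Vincenty's inverse formula on the WGS-84 ellipsoid, using the airport coordinates listed further down converted to decimal degrees. The tolerance and iteration cap are assumptions, not the site's actual parameters.

```python
import math

def vincenty_inverse(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Vincenty's inverse formula on the WGS-84 ellipsoid; returns meters."""
    a = 6378137.0             # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563     # WGS-84 flattening
    b = (1 - f) * a           # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
                * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# XIY and YJS coordinates from the airport table, in decimal degrees
meters = vincenty_inverse(34.446944, 108.751944, 41.906944, 128.409722)
print(round(meters / 1000, 2), "km")   # ≈ 1905 km
```

For points this far from antipodal the iteration converges in a handful of steps; the result agrees with the 1905.070 km figure quoted above.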

Haversine formula
  • 1181.852 miles
  • 1902.007 kilometers
  • 1027.002 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
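The haversine calculation is compact enough to sketch directly. This version assumes a mean Earth radius of 6371 km and takes the same decimal-degree coordinates as above:

```python
import math

def haversine(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points on a sphere, in km."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

km = haversine(34.446944, 108.751944, 41.906944, 128.409722)
print(round(km, 2), "km")  # ≈ 1902 km
```

The spherical assumption is why this result (about 1902 km) differs from the ellipsoidal Vincenty figure by roughly 3 km.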

How long does it take to fly from Xi'an to Samjiyon?

The estimated flight time from Xi'an Xianyang International Airport to Samjiyon Airport is 2 hours and 44 minutes.
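The page does not say how this estimate is derived. A common rule of thumb is distance divided by an average cruise speed plus a fixed allowance for taxi, climb, and descent; the sketch below uses assumed values (500 mph cruise, 30-minute overhead), not the site's actual parameters.

```python
def estimate_flight_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    # crude model: cruise time plus a fixed taxi/climb/descent allowance
    return distance_miles / cruise_mph * 60 + overhead_min

minutes = estimate_flight_minutes(1184)
print(f"{int(minutes // 60)} h {int(minutes % 60)} min")
```

With these assumed parameters the sketch gives roughly 2 h 52 min, in the same ballpark as the 2 h 44 min figure quoted above.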

What is the time difference between Xi'an and Samjiyon?

Samjiyon observes Pyongyang Time (UTC+9), one hour ahead of Xi'an's China Standard Time (UTC+8), so local time in Samjiyon is 1 hour later than in Xi'an.

Flight carbon footprint between Xi'an Xianyang International Airport (XIY) and Samjiyon Airport (YJS)

On average, flying from Xi'an to Samjiyon generates about 161 kg (354 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.

Map of flight path and driving directions from Xi'an to Samjiyon

See the map of the shortest flight path between Xi'an Xianyang International Airport (XIY) and Samjiyon Airport (YJS).

Airport information

Origin: Xi'an Xianyang International Airport
City: Xi'an
Country: China
IATA Code: XIY
ICAO Code: ZLXY
Coordinates: 34°26′49″N, 108°45′7″E
Destination: Samjiyon Airport
City: Samjiyon
Country: North Korea
IATA Code: YJS
ICAO Code: ZKSE
Coordinates: 41°54′25″N, 128°24′35″E
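The coordinates above are given in degrees, minutes, and seconds, while the distance formulas want decimal degrees. A small conversion helper (a hypothetical utility, not part of the site) shows how the values used in the examples were obtained:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere letter to decimal degrees."""
    sign = -1.0 if hemisphere in ("S", "W") else 1.0
    return sign * (degrees + minutes / 60 + seconds / 3600)

# XIY: 34°26′49″N, 108°45′7″E   /   YJS: 41°54′25″N, 128°24′35″E
xiy = (dms_to_decimal(34, 26, 49, "N"), dms_to_decimal(108, 45, 7, "E"))
yjs = (dms_to_decimal(41, 54, 25, "N"), dms_to_decimal(128, 24, 35, "E"))
print(xiy, yjs)
```

Southern and western hemispheres get a negative sign, the usual convention for signed decimal coordinates.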