How far is Samjiyon from Yushu?

The distance between Yushu (Yushu Batang Airport) and Samjiyon (Samjiyon Airport) is 1824 miles / 2936 kilometers / 1585 nautical miles.

The driving distance from Yushu (YUS) to Samjiyon (YJS) is 2294 miles / 3692 kilometers, and travel time by car is about 42 hours 59 minutes.

Yushu Batang Airport – Samjiyon Airport
  • 1824 miles
  • 2936 kilometers
  • 1585 nautical miles

Distance from Yushu to Samjiyon

There are several ways to calculate the distance from Yushu to Samjiyon. Here are two standard methods:

Vincenty's formula (applied above)
  • 1824.261 miles
  • 2935.863 kilometers
  • 1585.239 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
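
For reference, here is a minimal Python sketch of the inverse Vincenty iteration on the WGS-84 ellipsoid. The calculator's exact implementation and ellipsoid parameters are not published, so treat this as an approximation; the coordinates are the airport positions from the table at the bottom of the page, converted to decimal degrees.

    import math

    def vincenty_miles(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
        # Inverse Vincenty solution on the WGS-84 ellipsoid, returning statute miles.
        a, f = 6378137.0, 1 / 298.257223563  # WGS-84 semi-major axis (m) and flattening
        b = (1 - f) * a
        U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
        U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)
        L = math.radians(lon2 - lon1)
        lam = L
        for _ in range(max_iter):
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.hypot(cosU2 * sin_lam,
                                   cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
            if sin_sigma == 0:
                return 0.0  # coincident points
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos2_alpha = 1 - sin_alpha ** 2
            cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
            C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
            lam_prev = lam
            lam = L + (1 - C) * f * sin_alpha * (sigma + C * sin_sigma *
                  (cos_2sm + C * cos_sigma * (2 * cos_2sm ** 2 - 1)))
            if abs(lam - lam_prev) < tol:
                break
        u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (cos_sigma * (2 * cos_2sm ** 2 - 1) -
                  B / 6 * cos_2sm * (4 * sin_sigma ** 2 - 3) * (4 * cos_2sm ** 2 - 3)))
        return b * A * (sigma - d_sigma) / 1609.344  # meters per statute mile

    yus = (32.8364, 97.0364)   # 32°50′11″N, 97°2′11″E
    yjs = (41.9069, 128.4097)  # 41°54′25″N, 128°24′35″E
    print(round(vincenty_miles(*yus, *yjs), 1))  # ≈ 1824.3, close to the figure above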

Haversine formula
  • 1820.891 miles
  • 2930.439 kilometers
  • 1582.311 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
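
A corresponding haversine sketch in Python, using an assumed mean earth radius of 3,958.8 miles (the radius the site uses is not stated):

    import math

    def haversine_miles(lat1, lon1, lat2, lon2, radius=3958.8):
        # Great-circle distance on a sphere; radius is the mean earth radius in miles.
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        d_phi = math.radians(lat2 - lat1)
        d_lam = math.radians(lon2 - lon1)
        a = (math.sin(d_phi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(d_lam / 2) ** 2)
        return 2 * radius * math.asin(math.sqrt(a))

    print(round(haversine_miles(32.8364, 97.0364, 41.9069, 128.4097), 1))  # ≈ 1820.9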

How long does it take to fly from Yushu to Samjiyon?

The estimated flight time from Yushu Batang Airport to Samjiyon Airport is 3 hours and 57 minutes.
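
The page does not state the assumptions behind this estimate. A common rule of thumb adds a fixed allowance for taxi, climb, and descent to the cruise time; the sketch below uses an assumed 500 mph cruise speed and a 30-minute allowance, which lands close to, but not exactly on, the quoted figure.

    def estimate_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
        # Rule-of-thumb estimate: fixed taxi/climb/descent allowance plus cruise time.
        total_min = overhead_min + distance_miles / cruise_mph * 60
        hours, minutes = divmod(round(total_min), 60)
        return f"{hours} hours {minutes} minutes"

    print(estimate_flight_time(1824))  # -> "4 hours 9 minutes" with these assumed inputs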

What is the time difference between Yushu and Samjiyon?

The time difference between Yushu and Samjiyon is 1 hour: Samjiyon is on Pyongyang Time (UTC+9), 1 hour ahead of Yushu on China Standard Time (UTC+8).

Flight carbon footprint between Yushu Batang Airport (YUS) and Samjiyon Airport (YJS)

On average, flying from Yushu to Samjiyon generates about 202 kg (445 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
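
The unit conversions behind these numbers are straightforward; the 202 kg figure and its fuel-only scope come from the estimate above, and the conversion factors are standard.

    co2_kg = 202.0  # page estimate, CO2 from burning jet fuel only
    miles = 1824.0  # great-circle distance from above
    print(f"{co2_kg * 2.20462:.0f} lbs")                      # -> 445 lbs
    print(f"{co2_kg / miles:.3f} kg CO2 per passenger-mile")  # -> 0.111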

Map of flight path and driving directions from Yushu to Samjiyon

See the map of the shortest flight path between Yushu Batang Airport (YUS) and Samjiyon Airport (YJS).

Airport information

Origin: Yushu Batang Airport
City: Yushu
Country: China
IATA Code: YUS
ICAO Code: ZLYS
Coordinates: 32°50′11″N, 97°2′11″E
Destination: Samjiyon Airport
City: Samjiyon
Country: North Korea
IATA Code: YJS
ICAO Code: ZKSE
Coordinates: 41°54′25″N, 128°24′35″E
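
The coordinates above are given in degrees, minutes, and seconds. A small sketch for converting them to the decimal degrees used in the distance formulas (the parsing pattern assumes the exact notation shown in this table):

    import re

    def dms_to_decimal(dms):
        # Parse e.g. "32°50′11″N" into signed decimal degrees (south/west negative).
        deg, minutes, sec, hemi = re.match(r"(\d+)°(\d+)′(\d+)″([NSEW])", dms).groups()
        value = int(deg) + int(minutes) / 60 + int(sec) / 3600
        return -value if hemi in "SW" else value

    print(round(dms_to_decimal("32°50′11″N"), 4))   # -> 32.8364 (Yushu latitude)
    print(round(dms_to_decimal("128°24′35″E"), 4))  # -> 128.4097 (Samjiyon longitude)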