
How far is Samjiyon from Liupanshui?

The distance between Liupanshui (Liupanshui Yuezhao Airport) and Samjiyon (Samjiyon Airport) is 1696 miles / 2729 kilometers / 1474 nautical miles.

The driving distance from Liupanshui (LPF) to Samjiyon (YJS) is 2151 miles / 3461 kilometers, and travel time by car is about 40 hours 10 minutes.

Liupanshui Yuezhao Airport – Samjiyon Airport

1696 miles / 2729 kilometers / 1474 nautical miles


Distance from Liupanshui to Samjiyon

There are several ways to calculate the distance from Liupanshui to Samjiyon. Here are two standard methods:

Vincenty's formula (applied above)
  • 1695.972 miles
  • 2729.402 kilometers
  • 1473.759 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
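A quick way to reproduce an ellipsoidal distance like this in Python is the geopy library. Note that current geopy releases use Karney's geodesic algorithm on the WGS-84 ellipsoid, a refinement of Vincenty's formula, so the result may differ from the figure above by a fraction of a mile. The coordinates are the decimal equivalents of the DMS values listed in the airport information section below.

```python
# Ellipsoidal (WGS-84) distance sketch using geopy's geodesic().
# geopy implements Karney's algorithm, a refinement of Vincenty's
# formula, so the output may differ slightly from the figure quoted above.
from geopy.distance import geodesic

LPF = (26.609167, 104.978889)   # Liupanshui Yuezhao Airport (26°36′33″N, 104°58′44″E)
YJS = (41.906944, 128.409722)   # Samjiyon Airport (41°54′25″N, 128°24′35″E)

d = geodesic(LPF, YJS)
print(f"{d.miles:.3f} miles / {d.km:.3f} km / {d.nautical:.3f} NM")
# Expect roughly 1696 miles / 2729 km / 1474 NM.
```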

Haversine formula
  • 1695.285 miles
  • 2728.298 kilometers
  • 1473.163 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
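For comparison, here is a minimal haversine sketch in Python. The mean Earth radius of 6371 km is an assumption, so the output can differ from the figures above by a kilometer or so.

```python
# Great-circle (haversine) distance sketch, assuming a spherical Earth
# with mean radius 6371 km.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Return the great-circle distance between two points in kilometers."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
    return 2 * radius_km * asin(sqrt(a))

# LPF and YJS in decimal degrees (converted from the DMS coordinates below).
km = haversine_km(26.609167, 104.978889, 41.906944, 128.409722)
print(f"{km:.1f} km / {km * 0.621371:.1f} miles / {km / 1.852:.1f} NM")
# Prints roughly 2728 km / 1695 miles / 1473 NM, close to the haversine figures above.
```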

How long does it take to fly from Liupanshui to Samjiyon?

The estimated flight time from Liupanshui Yuezhao Airport to Samjiyon Airport is 3 hours and 42 minutes.
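The calculator's exact assumptions are not published. A common rule of thumb is an average block speed of about 500 mph plus roughly 30 minutes for taxi, climb, and descent; the sketch below uses those assumed parameters, which land in the same ballpark as (but not exactly at) the 3 hours 42 minutes quoted above.

```python
# Rough flight-time estimate: assumed 500 mph average speed plus a
# 30-minute allowance for taxi, climb, and descent. These parameters
# are assumptions, not the calculator's published method.
def estimate_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    total_min = overhead_min + distance_miles / cruise_mph * 60
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} h {minutes} min"

print(estimate_flight_time(1696))  # about 3 h 54 min with these assumptions
```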

What is the time difference between Liupanshui and Samjiyon?

Samjiyon is 1 hour ahead of Liupanshui: China observes China Standard Time (UTC+8), while North Korea observes Pyongyang Time (UTC+9).
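The offsets can be checked with the standard library's zoneinfo module (Python 3.9+), assuming the usual IANA zone names for the two cities.

```python
# Look up the current UTC offsets for the two locations' time zones.
# IANA zone names assumed: Asia/Shanghai for Liupanshui, Asia/Pyongyang for Samjiyon.
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

now = datetime.now(timezone.utc)
for zone in ("Asia/Shanghai", "Asia/Pyongyang"):
    print(zone, now.astimezone(ZoneInfo(zone)).utcoffset())
# Asia/Shanghai 8:00:00
# Asia/Pyongyang 9:00:00
```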

Flight carbon footprint between Liupanshui Yuezhao Airport (LPF) and Samjiyon Airport (YJS)

On average, flying from Liupanshui to Samjiyon generates about 192 kg of CO2 per passenger, which is equivalent to 424 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
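The page does not state how the 192 kg figure is derived. The sketch below simply applies a per-passenger emission factor back-calculated from this route's numbers (an assumption, not the site's published methodology) and converts the result to pounds.

```python
# Rough per-passenger CO2 estimate. The emission factor is an assumption
# back-calculated from this route's figure (192 kg over 2729 km), not the
# site's published methodology.
KG_PER_PAX_KM = 192 / 2729        # ≈ 0.070 kg CO2 per passenger-km (assumed)
KG_TO_LBS = 2.20462

def co2_estimate_kg(distance_km, factor=KG_PER_PAX_KM):
    return distance_km * factor

kg = co2_estimate_kg(2729)
print(f"{kg:.0f} kg CO2 per passenger ({kg * KG_TO_LBS:.0f} lbs)")
# -> 192 kg CO2 per passenger (423 lbs); the page's 424 lbs presumably
#    comes from an unrounded kilogram value.
```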

Map of flight path and driving directions from Liupanshui to Samjiyon

See the map of the shortest flight path between Liupanshui Yuezhao Airport (LPF) and Samjiyon Airport (YJS).

Airport information

Origin: Liupanshui Yuezhao Airport
City: Liupanshui
Country: China
IATA Code: LPF
ICAO Code: ZUPS
Coordinates: 26°36′33″N, 104°58′44″E
Destination: Samjiyon Airport
City: Samjiyon
Country: North Korea
IATA Code: YJS
ICAO Code: ZKSE
Coordinates: 41°54′25″N, 128°24′35″E
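The distance formulas above need these coordinates in decimal degrees. A small helper for converting the DMS values listed here produces the decimal coordinates used in the earlier sketches.

```python
# Convert the DMS coordinates listed above to decimal degrees.
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# Liupanshui Yuezhao Airport (LPF): 26°36′33″N, 104°58′44″E
print(dms_to_decimal(26, 36, 33, "N"), dms_to_decimal(104, 58, 44, "E"))
# -> 26.609166..., 104.978888...

# Samjiyon Airport (YJS): 41°54′25″N, 128°24′35″E
print(dms_to_decimal(41, 54, 25, "N"), dms_to_decimal(128, 24, 35, "E"))
# -> 41.906944..., 128.409722...
```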