
How far is Samjiyon from Lüliang?

The distance between Lüliang (Lüliang Dawu Airport) and Samjiyon (Samjiyon Airport) is 962 miles / 1548 kilometers / 836 nautical miles.

The driving distance from Lüliang (LLV) to Samjiyon (YJS) is 1182 miles / 1902 kilometers, and travel time by car is about 22 hours 29 minutes.

Lüliang Dawu Airport – Samjiyon Airport
  • 962 miles
  • 1548 kilometers
  • 836 nautical miles


Distance from Lüliang to Samjiyon

There are several ways to calculate the distance from Lüliang to Samjiyon. Here are two standard methods:

Vincenty's formula (applied above)
  • 962.062 miles
  • 1548.289 kilometers
  • 836.009 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
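For reference, here is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid. It is an illustration of the technique, not the exact implementation used by this page, and the decimal coordinates are converted from the DMS values listed in the airport information below.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns statute miles.
    A minimal sketch; it may fail to converge for nearly antipodal points."""
    a = 6378137.0            # WGS-84 semi-major axis (metres)
    f = 1 / 298.257223563    # WGS-84 flattening
    b = (1 - f) * a

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m + C * cos_sigma *
                                     (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
        B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sigma_m ** 2)))
    metres = b * A * (sigma - delta_sigma)
    return metres / 1609.344  # metres per statute mile

# Approximate decimal coordinates of LLV and YJS (from the DMS values listed below)
print(round(vincenty_miles(37.6831, 111.1428, 41.9069, 128.4097), 1))  # ~962 miles
```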

Haversine formula
  • 960.020 miles
  • 1545.002 kilometers
  • 834.235 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
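The haversine calculation is compact enough to show in full. The sketch below assumes a mean earth radius of 3958.8 miles and uses the same approximate decimal coordinates as above; it illustrates the formula rather than reproducing this page's exact code.

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance in statute miles, assuming a spherical earth."""
    r_miles = 3958.8  # assumed mean earth radius in miles
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * r_miles * math.asin(math.sqrt(a))

# Approximate decimal coordinates of LLV and YJS (from the DMS values listed below)
print(round(haversine_miles(37.6831, 111.1428, 41.9069, 128.4097), 1))  # roughly 960 miles
```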

How long does it take to fly from Lüliang to Samjiyon?

The estimated flight time from Lüliang Dawu Airport to Samjiyon Airport is 2 hours and 19 minutes.
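The page does not state the model behind this figure, but a simple distance-based estimate can be sketched: a cruise segment at an assumed average speed plus a fixed allowance for taxi, climb and descent. Both parameters below are assumptions for illustration, not the calculator's actual inputs, so the result only roughly matches the quoted time.

```python
def estimate_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough block-time estimate: cruise segment plus a fixed taxi/climb/descent
    allowance. cruise_mph and overhead_min are assumed values, not the exact
    figures behind the 2 hours 19 minutes quoted above."""
    total_min = overhead_min + distance_miles / cruise_mph * 60
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} h {minutes:02d} min"

print(estimate_flight_time(962))  # roughly 2 h 25 min with these assumptions
```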

What is the time difference between Lüliang and Samjiyon?

Samjiyon is 1 hour ahead of Lüliang: China observes China Standard Time (UTC+8), while North Korea observes Pyongyang Time (UTC+9).

Flight carbon footprint between Lüliang Dawu Airport (LLV) and Samjiyon Airport (YJS)

On average, flying from Lüliang to Samjiyon generates about 148 kg of CO2 per passenger, which is equal to about 326 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
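As a rough illustration of the arithmetic, the sketch below reproduces the kilogram-to-pound conversion and the per-passenger-mile factor implied by the figures above; the factor is derived from this page's own numbers, not from an official emissions model.

```python
KG_PER_LB = 0.45359237  # exact definition of the pound in kilograms

def kg_to_lb(kg):
    return kg / KG_PER_LB

# Per-passenger factor implied by this page's figures (an approximation,
# not an official emissions model): 148 kg over 962 miles.
implied_factor = 148 / 962           # ≈ 0.154 kg CO2 per passenger-mile
co2_kg = implied_factor * 962        # reproduces the 148 kg estimate
print(round(co2_kg), "kg ≈", round(kg_to_lb(co2_kg)), "lb")  # 148 kg ≈ 326 lb
```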

Map of flight path and driving directions from Lüliang to Samjiyon

See the map of the shortest flight path between Lüliang Dawu Airport (LLV) and Samjiyon Airport (YJS).

Airport information

Origin Lüliang Dawu Airport
City: Lüliang
Country: China
IATA Code: LLV
ICAO Code: ZBLL
Coordinates: 37°40′59″N, 111°8′34″E
Destination Samjiyon Airport
City: Samjiyon
Country: North Korea
IATA Code: YJS
ICAO Code: ZKSE
Coordinates: 41°54′25″N, 128°24′35″E
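The coordinates above are given in degrees, minutes and seconds; the distance formulas earlier expect decimal degrees. The small helper below converts between the two, using the exact coordinate strings listed here.

```python
import re

def dms_to_decimal(dms: str) -> float:
    """Convert a coordinate like 37°40′59″N into decimal degrees."""
    deg, minutes, seconds, hemi = re.match(r"(\d+)°(\d+)′(\d+)″([NSEW])", dms).groups()
    value = int(deg) + int(minutes) / 60 + int(seconds) / 3600
    return -value if hemi in "SW" else value

llv = (dms_to_decimal("37°40′59″N"), dms_to_decimal("111°8′34″E"))
yjs = (dms_to_decimal("41°54′25″N"), dms_to_decimal("128°24′35″E"))
print(llv)  # (37.683..., 111.142...)
print(yjs)  # (41.906..., 128.409...)
```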