How far is Samjiyon from Qinhuangdao?

The distance between Qinhuangdao (Qinhuangdao Beidaihe Airport) and Samjiyon (Samjiyon Airport) is 514 miles / 827 kilometers / 447 nautical miles.

The driving distance from Qinhuangdao (BPE) to Samjiyon (YJS) is 657 miles / 1057 kilometers, and travel time by car is about 13 hours 2 minutes.

Qinhuangdao Beidaihe Airport – Samjiyon Airport

514 miles | 827 kilometers | 447 nautical miles

Distance from Qinhuangdao to Samjiyon

There are several ways to calculate the distance from Qinhuangdao to Samjiyon. Here are two standard methods:

Vincenty's formula (applied above)
  • 513.909 miles
  • 827.057 kilometers
  • 446.575 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
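As a rough illustration, the ellipsoidal distance can be reproduced in Python with the geopy library. Its geodesic function uses Karney's method on the WGS-84 ellipsoid rather than Vincenty's iteration, but the two agree to well under a meter at this range. The decimal coordinates below are converted from the DMS values listed under "Airport information":

```python
from geopy.distance import geodesic  # pip install geopy

# Decimal-degree coordinates, converted from the DMS values listed below
bpe = (39.6664, 119.0589)   # Qinhuangdao Beidaihe Airport (BPE)
yjs = (41.9069, 128.4097)   # Samjiyon Airport (YJS)

d = geodesic(bpe, yjs)      # ellipsoidal (WGS-84) distance
print(f"{d.miles:.3f} mi / {d.kilometers:.3f} km / {d.nautical:.3f} nmi")
```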

Haversine formula
  • 512.781 miles
  • 825.241 kilometers
  • 445.595 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth, giving the great-circle distance: the shortest path between two points along the sphere's surface.
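A minimal sketch of the haversine computation, assuming a mean Earth radius of 6,371 km (the exact radius the calculator uses is not stated):

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, r_km=6371.0):
    """Great-circle distance on a sphere of radius r_km (mean Earth radius)."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
    return 2 * r_km * asin(sqrt(a))

# BPE -> YJS with the coordinates listed under "Airport information"
print(haversine_km(39.6664, 119.0589, 41.9069, 128.4097))  # ~825 km
```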

How long does it take to fly from Qinhuangdao to Samjiyon?

The estimated flight time from Qinhuangdao Beidaihe Airport to Samjiyon Airport is 1 hour and 28 minutes.
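The calculator's exact formula isn't published. A common rule of thumb (an assumption here, not necessarily the site's method) adds a fixed allowance of about 30 minutes for taxi, climb, and descent to the cruise time at roughly 500 mph, which lands in the same ballpark:

```python
def estimated_flight_time_min(distance_miles, cruise_mph=500, overhead_min=30):
    """Rule-of-thumb estimate: cruise time plus a fixed taxi/climb/descent allowance."""
    return overhead_min + 60 * distance_miles / cruise_mph

t = estimated_flight_time_min(514)
print(f"{int(t // 60)} h {round(t % 60)} min")  # ~1 h 32 min, close to the figure above
```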

What is the time difference between Qinhuangdao and Samjiyon?

Samjiyon is 1 hour ahead of Qinhuangdao: China observes China Standard Time (UTC+8), while North Korea observes Pyongyang Time (UTC+9).

Flight carbon footprint between Qinhuangdao Beidaihe Airport (BPE) and Samjiyon Airport (YJS)

On average, flying from Qinhuangdao to Samjiyon generates about 101 kg (222 lbs) of CO2 per passenger. These figures are estimates and cover only the CO2 produced by burning jet fuel.
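The site doesn't publish its methodology. A typical approach, sketched here under assumed aircraft figures, converts the trip's fuel burn to CO2 at the standard factor of about 3.16 kg of CO2 per kg of jet fuel and divides by the number of occupied seats:

```python
CO2_PER_KG_FUEL = 3.16  # kg of CO2 per kg of jet fuel burned (standard emissions factor)

def co2_per_passenger_kg(trip_fuel_kg, seats, load_factor=0.80):
    """Split the trip's total fuel-burn emissions across the occupied seats."""
    return trip_fuel_kg * CO2_PER_KG_FUEL / (seats * load_factor)

# Hypothetical regional-jet figures for a ~830 km sector (assumptions, not the site's data)
print(round(co2_per_passenger_kg(trip_fuel_kg=2300, seats=90)))  # ~101 kg
```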

Map of flight path and driving directions from Qinhuangdao to Samjiyon

See the map of the shortest flight path between Qinhuangdao Beidaihe Airport (BPE) and Samjiyon Airport (YJS).

Airport information

Origin: Qinhuangdao Beidaihe Airport
City: Qinhuangdao
Country: China
IATA Code: BPE
ICAO Code: ZBDH
Coordinates: 39°39′59″N, 119°3′32″E
Destination: Samjiyon Airport
City: Samjiyon
Country: North Korea
IATA Code: YJS
ICAO Code: ZKSE
Coordinates: 41°54′25″N, 128°24′35″E
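The coordinates above are given in degrees/minutes/seconds; a small helper (a sketch, not part of the site) converts them to the decimal degrees used in the distance examples:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

print(dms_to_decimal(39, 39, 59, "N"), dms_to_decimal(119, 3, 32, "E"))   # BPE: 39.6664, 119.0589
print(dms_to_decimal(41, 54, 25, "N"), dms_to_decimal(128, 24, 35, "E"))  # YJS: 41.9069, 128.4097
```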