
How far is Samjiyon from Hongping?

The distance between Hongping (Shennongjia Hongping Airport) and Samjiyon (Samjiyon Airport) is 1224 miles / 1969 kilometers / 1063 nautical miles.

The driving distance from Hongping (HPG) to Samjiyon (YJS) is 1563 miles / 2516 kilometers, and travel time by car is about 29 hours 43 minutes.

Shennongjia Hongping Airport – Samjiyon Airport

  • 1224 miles
  • 1969 kilometers
  • 1063 nautical miles
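The three figures above are the same distance in different units. A minimal sketch of the conversions, using the exact definitions of the statute mile and the nautical mile and the unrounded kilometre figure quoted later on this page:

```python
# Exact unit definitions: 1 mile = 1.609344 km, 1 nautical mile = 1.852 km
KM_PER_MILE = 1.609344
KM_PER_NM = 1.852

km = 1969.278                 # Vincenty distance in kilometers (from this page)
miles = km / KM_PER_MILE      # → 1223.653 statute miles
nm = km / KM_PER_NM           # → 1063.325 nautical miles
print(f"{miles:.3f} mi, {nm:.3f} NM")
```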


Distance from Hongping to Samjiyon

There are several ways to calculate the distance from Hongping to Samjiyon. Here are two standard methods:

Vincenty's formula (applied above)
  • 1223.653 miles
  • 1969.278 kilometers
  • 1063.325 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
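A compact sketch of Vincenty's inverse method on the WGS-84 ellipsoid, using the airport coordinates from the table at the bottom of this page converted to decimal degrees (edge cases such as coincident or nearly antipodal points are omitted for brevity):

```python
from math import radians, sin, cos, tan, atan, atan2, sqrt

def vincenty_km(lat1, lon1, lat2, lon2):
    """Vincenty inverse solution on the WGS-84 ellipsoid; returns km."""
    a, f = 6378137.0, 1 / 298.257223563   # WGS-84 semi-major axis (m), flattening
    b = (1 - f) * a                        # semi-minor axis
    L = radians(lon2 - lon1)
    U1 = atan((1 - f) * tan(radians(lat1)))   # reduced latitudes
    U2 = atan((1 - f) * tan(radians(lat2)))
    sU1, cU1, sU2, cU2 = sin(U1), cos(U1), sin(U2), cos(U2)

    lam = L
    for _ in range(200):                   # iterate lambda to convergence
        sl, cl = sin(lam), cos(lam)
        sin_sig = sqrt((cU2 * sl) ** 2 + (cU1 * sU2 - sU1 * cU2 * cl) ** 2)
        cos_sig = sU1 * sU2 + cU1 * cU2 * cl
        sig = atan2(sin_sig, cos_sig)
        sin_alp = cU1 * cU2 * sl / sin_sig
        cos2_alp = 1 - sin_alp ** 2
        cos_2sm = cos_sig - 2 * sU1 * sU2 / cos2_alp
        C = f / 16 * cos2_alp * (4 + f * (4 - 3 * cos2_alp))
        lam_new = L + (1 - C) * f * sin_alp * (
            sig + C * sin_sig * (cos_2sm + C * cos_sig * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam_new - lam) < 1e-12:
            lam = lam_new
            break
        lam = lam_new

    u2 = cos2_alp * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sig = B * sin_sig * (cos_2sm + B / 4 * (
        cos_sig * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sig ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sig - d_sig) / 1000.0

# HPG (31°37′33″N, 110°20′24″E) and YJS (41°54′25″N, 128°24′35″E) in decimal degrees
dist_v = vincenty_km(31.6258, 110.3400, 41.9069, 128.4097)
print(f"{dist_v:.3f} km")  # ≈ 1969 km, matching the figure above
```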

Haversine formula
  • 1222.593 miles
  • 1967.573 kilometers
  • 1062.405 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
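The haversine calculation is short enough to sketch in full. This version assumes a mean Earth radius of 6371 km and takes the same decimal-degree coordinates as above:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, r=6371.0):
    """Great-circle distance in km on a sphere of radius r (mean Earth radius)."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
    return 2 * r * asin(sqrt(a))

# Airport coordinates from the table at the bottom of this page (decimal degrees)
hpg = (31.6258, 110.3400)   # Shennongjia Hongping Airport
yjs = (41.9069, 128.4097)   # Samjiyon Airport

dist_h = haversine_km(*hpg, *yjs)
print(f"{dist_h:.1f} km")   # ≈ 1968 km, matching the haversine figure above
```

The roughly 1.7 km gap between the two results reflects the spherical approximation: the haversine answer shifts slightly depending on the chosen Earth radius, while Vincenty models the flattening of the ellipsoid.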

How long does it take to fly from Hongping to Samjiyon?

The estimated flight time from Shennongjia Hongping Airport to Samjiyon Airport is 2 hours and 49 minutes.
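A common rule of thumb for such estimates is a fixed allowance for taxi, climb, and descent plus cruise at a typical airliner speed. The numbers below (30 minutes overhead, 500 mph cruise) are illustrative assumptions, not the calculator's actual model, so the result lands near but not exactly on the figure above:

```python
def flight_time_minutes(miles, cruise_mph=500, overhead_min=30):
    """Rough airline flight-time estimate: fixed overhead plus cruise time."""
    return overhead_min + miles / cruise_mph * 60

t = round(flight_time_minutes(1224))  # distance HPG–YJS in statute miles
h, m = divmod(t, 60)
print(f"about {h} h {m} min")         # ≈ 2 h 57 min with these assumptions
```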

What is the time difference between Hongping and Samjiyon?

There is no time difference between Hongping and Samjiyon.

Flight carbon footprint between Shennongjia Hongping Airport (HPG) and Samjiyon Airport (YJS)

On average, flying from Hongping to Samjiyon generates about 162 kg of CO2 per passenger, which is roughly 357 pounds (lbs). The figures are estimates that include only the CO2 generated by burning jet fuel.
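Such estimates typically scale a per-passenger emission factor by the flight distance. The factor below (about 0.132 kg CO2 per passenger-mile) is back-derived from the figures on this page for illustration, not the calculator's published methodology:

```python
def co2_estimate(miles, kg_per_mile=0.1324):
    """Rough per-passenger CO2 estimate from flight distance (jet fuel only)."""
    kg = miles * kg_per_mile
    lbs = kg * 2.20462          # kilograms to pounds
    return kg, lbs

kg, lbs = co2_estimate(1224)    # distance HPG–YJS in statute miles
print(f"{kg:.0f} kg ≈ {lbs:.0f} lbs")  # ≈ 162 kg ≈ 357 lbs
```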

Map of flight path and driving directions from Hongping to Samjiyon

See the map of the shortest flight path between Shennongjia Hongping Airport (HPG) and Samjiyon Airport (YJS).

Airport information

Origin Shennongjia Hongping Airport
City: Hongping
Country: China
IATA Code: HPG
ICAO Code: ZHSN
Coordinates: 31°37′33″N, 110°20′24″E
Destination Samjiyon Airport
City: Samjiyon
Country: North Korea
IATA Code: YJS
ICAO Code: ZKSE
Coordinates: 41°54′25″N, 128°24′35″E