
How far is Samjiyon from Harbin?

The distance between Harbin (Harbin Taiping International Airport) and Samjiyon (Samjiyon Airport) is 278 miles / 448 kilometers / 242 nautical miles.

The driving distance from Harbin (HRB) to Samjiyon (YJS) is 374 miles / 602 kilometers, and travel time by car is about 7 hours 49 minutes.

Harbin Taiping International Airport – Samjiyon Airport

  • 278 miles
  • 448 kilometers
  • 242 nautical miles


Distance from Harbin to Samjiyon

There are several ways to calculate the distance from Harbin to Samjiyon. Here are two standard methods:

Vincenty's formula (applied above)
  • 278.369 miles
  • 447.991 kilometers
  • 241.896 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
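As a sketch of how such a figure can be computed, the following is a minimal implementation of Vincenty's inverse formula on the WGS-84 ellipsoid. The function name and structure are illustrative, not taken from the calculator itself.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Inverse Vincenty on the WGS-84 ellipsoid; returns meters."""
    a = 6378137.0            # semi-major axis (m)
    f = 1 / 298.257223563    # flattening
    b = (1 - f) * a          # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0       # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)  # equatorial-line guard
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# HRB (45°37′24″N, 126°15′0″E) to YJS (41°54′25″N, 128°24′35″E)
km = vincenty_distance(45.623333, 126.25, 41.906944, 128.409722) / 1000
```

Run against the two airport coordinates listed below, this lands at roughly 448 km, in line with the figure above.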

Haversine formula
  • 278.441 miles
  • 448.107 kilometers
  • 241.959 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
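The haversine calculation is compact enough to show in full. This sketch uses a mean Earth radius of 6,371 km; the function name and that radius choice are assumptions, not details taken from the calculator.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given mean Earth radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return radius_km * 2 * math.atan2(math.sqrt(a), math.sqrt(1 - a))

# HRB (45.623333°N, 126.25°E) to YJS (41.906944°N, 128.409722°E)
km = haversine_km(45.623333, 126.25, 41.906944, 128.409722)
```

With these coordinates the result is about 448 km (roughly 278 miles), matching the haversine figures above to within the rounding of the radius.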

How long does it take to fly from Harbin to Samjiyon?

The estimated flight time from Harbin Taiping International Airport to Samjiyon Airport is 1 hour and 1 minute.
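Estimates like this typically add a fixed allowance for taxi, climb, and descent to the cruise time. The sketch below uses an assumed 500 mph cruise speed and a 30-minute overhead; these parameters are illustrative and land near, though not exactly on, the figure above.

```python
def estimate_flight_time_min(distance_miles, cruise_mph=500, overhead_min=30):
    # cruise time plus a fixed allowance for taxi, takeoff, climb and descent
    return overhead_min + distance_miles / cruise_mph * 60

minutes = estimate_flight_time_min(278)  # ≈ 63 minutes for the HRB–YJS leg
```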

What is the time difference between Harbin and Samjiyon?

Samjiyon is 1 hour ahead of Harbin: North Korea observes Pyongyang Time (UTC+9), while Harbin, like all of China, uses China Standard Time (UTC+8).

Flight carbon footprint between Harbin Taiping International Airport (HRB) and Samjiyon Airport (YJS)

On average, flying from Harbin to Samjiyon generates about 66 kg (145 lbs) of CO2 per passenger. This figure is an estimate and covers only the CO2 produced by burning jet fuel.
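The kilogram-to-pound conversion behind that figure is a one-liner; the constant below is the exact definition of the avoirdupois pound.

```python
KG_PER_LB = 0.45359237  # kilograms per pound, by definition

def kg_to_lbs(kg):
    """Convert kilograms to avoirdupois pounds."""
    return kg / KG_PER_LB

lbs = kg_to_lbs(66)  # ≈ 145.5 lbs, quoted as 145 lbs above
```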

Map of flight path and driving directions from Harbin to Samjiyon

See the map of the shortest flight path between Harbin Taiping International Airport (HRB) and Samjiyon Airport (YJS).

Airport information

Origin: Harbin Taiping International Airport
City: Harbin
Country: China
IATA Code: HRB
ICAO Code: ZYHB
Coordinates: 45°37′24″N, 126°15′0″E
Destination: Samjiyon Airport
City: Samjiyon
Country: North Korea
IATA Code: YJS
ICAO Code: ZKSE
Coordinates: 41°54′25″N, 128°24′35″E
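The coordinates above are given in degrees, minutes, and seconds, while the distance formulas work in decimal degrees. A small parser bridges the two; the function name and regex are illustrative.

```python
import re

def dms_to_decimal(dms):
    """Convert a coordinate like '45°37′24″N' to signed decimal degrees."""
    deg, minutes, seconds, hemi = re.match(
        r"(\d+)°(\d+)′(\d+)″([NSEW])", dms).groups()
    value = int(deg) + int(minutes) / 60 + int(seconds) / 3600
    return -value if hemi in "SW" else value  # south/west are negative

hrb = (dms_to_decimal("45°37′24″N"), dms_to_decimal("126°15′0″E"))
yjs = (dms_to_decimal("41°54′25″N"), dms_to_decimal("128°24′35″E"))
```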