
How far is Beijing from Jeju?

The distance between Jeju (Jeju International Airport) and Beijing (Beijing Nanyuan Airport) is 708 miles / 1139 kilometers / 615 nautical miles.

The driving distance from Jeju (CJU) to Beijing (NAY) is 1158 miles / 1863 kilometers, and travel time by car is about 22 hours 9 minutes.

Jeju International Airport – Beijing Nanyuan Airport

  • 708 miles
  • 1139 kilometers
  • 615 nautical miles


Distance from Jeju to Beijing

There are several ways to calculate the distance from Jeju to Beijing. Here are two standard methods:

Vincenty's formula (applied above)
  • 707.907 miles
  • 1139.265 kilometers
  • 615.154 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
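A minimal sketch of an ellipsoidal (WGS-84) distance calculation is shown below. It uses the geopy library, which is not referenced on this page and is included only for illustration; geopy's geodesic() uses Karney's method rather than Vincenty's formula, but the two agree to well under a metre on a route like this.

```python
# Sketch: ellipsoidal (WGS-84) distance between CJU and NAY using geopy.
# geopy is an assumption for illustration, not the method used by this page.
from geopy.distance import geodesic

cju = (33.5111, 126.4928)   # Jeju International Airport (33°30′40″N, 126°29′34″E)
nay = (39.7828, 116.3878)   # Beijing Nanyuan Airport (39°46′58″N, 116°23′16″E)

d = geodesic(cju, nay)
print(f"{d.miles:.3f} miles, {d.km:.3f} km, {d.nm:.3f} NM")
# Expect values close to the 707.9 mi / 1139.3 km / 615.2 NM quoted above.
```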

Haversine formula
  • 707.417 miles
  • 1138.477 kilometers
  • 614.728 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
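The haversine formula is short enough to implement directly. The sketch below assumes a mean Earth radius of 3958.8 miles; the page does not state which radius it uses, so results may differ slightly from the figures above.

```python
# Sketch: great-circle (haversine) distance, assuming a spherical Earth.
# The 3958.8-mile mean radius is an assumption, not a value from this page.
from math import radians, sin, cos, asin, sqrt

def haversine_miles(lat1, lon1, lat2, lon2, radius_miles=3958.8):
    """Great-circle distance between two lat/lon points, in miles."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
    return 2 * radius_miles * asin(sqrt(a))

miles = haversine_miles(33.5111, 126.4928, 39.7828, 116.3878)
print(f"{miles:.3f} mi, {miles * 1.609344:.3f} km, {miles / 1.150779:.3f} NM")
# Prints roughly 707.4 mi / 1138.5 km / 614.7 NM, matching the list above.
```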

How long does it take to fly from Jeju to Beijing?

The estimated flight time from Jeju International Airport to Beijing Nanyuan Airport is 1 hour and 50 minutes.
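The page does not say how this estimate is derived. A common rule of thumb is cruise at an average ground speed plus a fixed allowance for take-off, climb, descent, and landing; the 500 mph speed and 25-minute allowance below are assumptions chosen to roughly reproduce the 1 hour 50 minutes quoted above, not values from this page.

```python
# Sketch of a simple flight-time estimate: cruise time plus a fixed allowance.
# Both parameters are illustrative assumptions, not the page's actual method.
def estimate_flight_minutes(distance_miles, avg_speed_mph=500, overhead_min=25):
    return distance_miles / avg_speed_mph * 60 + overhead_min

total = estimate_flight_minutes(707.907)
print(f"about {int(total // 60)} h {round(total % 60)} min")  # about 1 h 50 min
```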

Flight carbon footprint between Jeju International Airport (CJU) and Beijing Nanyuan Airport (NAY)

On average, flying from Jeju to Beijing generates about 125 kg of CO2 per passenger, which is roughly 276 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
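The per-passenger figure works out to roughly 0.18 kg of CO2 per mile flown on this route. The sketch below shows that arithmetic along with the kilogram-to-pound conversion; the per-mile factor is back-calculated from the numbers above, not taken from a published emission methodology.

```python
# Sketch of the arithmetic behind the footprint figure above. The emission
# factor is derived from this page's own numbers (125 kg over ~708 mi),
# not from a published dataset.
distance_miles = 707.907
co2_kg = 125.0

kg_per_mile = co2_kg / distance_miles   # ~0.177 kg CO2 per passenger-mile
co2_lbs = co2_kg * 2.20462              # ~276 lb
print(f"{kg_per_mile:.3f} kg/mi, {co2_lbs:.0f} lb total")
```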

Map of flight path and driving directions from Jeju to Beijing

See the map of the shortest flight path between Jeju International Airport (CJU) and Beijing Nanyuan Airport (NAY).

Airport information

Origin: Jeju International Airport
City: Jeju
Country: South Korea
IATA Code: CJU
ICAO Code: RKPC
Coordinates: 33°30′40″N, 126°29′34″E
Destination: Beijing Nanyuan Airport
City: Beijing
Country: China
IATA Code: NAY
ICAO Code: ZBNY
Coordinates: 39°46′58″N, 116°23′16″E