
How far is Beijing from Cheongju?

The distance between Cheongju (Cheongju International Airport) and Beijing (Beijing Nanyuan Airport) is 640 miles / 1030 kilometers / 556 nautical miles.

The driving distance from Cheongju (CJJ) to Beijing (NAY) is 901 miles / 1450 kilometers, and travel time by car is about 17 hours 7 minutes.

Cheongju International Airport – Beijing Nanyuan Airport


Distance from Cheongju to Beijing

There are several ways to calculate the distance from Cheongju to Beijing. Here are two standard methods:

Vincenty's formula (applied above)
  • 639.735 miles
  • 1029.554 kilometers
  • 555.915 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
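As a hedged sketch of how such a figure can be produced, here is a plain-Python implementation of the Vincenty inverse formula on the WGS-84 ellipsoid, applied to the airport coordinates given in the airport information section (converted to decimal degrees for this example):

```python
from math import atan, atan2, cos, radians, sin, sqrt, tan

# WGS-84 ellipsoid parameters
A_AXIS = 6378137.0            # semi-major axis (m)
F = 1 / 298.257223563         # flattening
B_AXIS = A_AXIS * (1 - F)     # semi-minor axis (m)

def vincenty_km(lat1, lon1, lat2, lon2, iterations=200, tol=1e-12):
    """Vincenty inverse formula: geodesic distance (km) on the WGS-84 ellipsoid."""
    U1 = atan((1 - F) * tan(radians(lat1)))   # reduced latitudes
    U2 = atan((1 - F) * tan(radians(lat2)))
    L = radians(lon2 - lon1)
    lam = L
    for _ in range(iterations):
        sin_sigma = sqrt((cos(U2) * sin(lam)) ** 2 +
                         (cos(U1) * sin(U2) - sin(U1) * cos(U2) * cos(lam)) ** 2)
        cos_sigma = sin(U1) * sin(U2) + cos(U1) * cos(U2) * cos(lam)
        sigma = atan2(sin_sigma, cos_sigma)
        sin_alpha = cos(U1) * cos(U2) * sin(lam) / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sin(U1) * sin(U2) / cos2_alpha
        C = F / 16 * cos2_alpha * (4 + F * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * F * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m +
                                     C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:   # converged
            break
    u2 = cos2_alpha * (A_AXIS ** 2 - B_AXIS ** 2) / B_AXIS ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
        B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) *
        (-3 + 4 * cos_2sigma_m ** 2)))
    return B_AXIS * A * (sigma - delta_sigma) / 1000.0

# CJJ (36°42′59″N, 127°29′56″E) to NAY (39°46′58″N, 116°23′16″E)
print(round(vincenty_km(36.71639, 127.49889, 39.78278, 116.38778), 1))
```

The result lands very close to the 1029.55 km figure quoted above; small differences can come from rounding the coordinates.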

Haversine formula
  • 638.491 miles
  • 1027.551 kilometers
  • 554.833 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
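The haversine calculation is compact enough to show in full. A minimal sketch, using the airport coordinates from the airport information section converted to decimal degrees and a mean earth radius of 6371 km (the radius choice is an assumption of this example):

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points on a spherical earth."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
    return 2 * radius_km * asin(sqrt(a))

# CJJ (36°42′59″N, 127°29′56″E) and NAY (39°46′58″N, 116°23′16″E)
cjj = (36.71639, 127.49889)
nay = (39.78278, 116.38778)
print(round(haversine_km(*cjj, *nay), 1))  # roughly 1028 km
```

The slight gap between this and the Vincenty figure reflects the spherical versus ellipsoidal earth models.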

How long does it take to fly from Cheongju to Beijing?

The estimated flight time from Cheongju International Airport to Beijing Nanyuan Airport is 1 hour and 42 minutes.
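Estimates like this are typically the great-circle distance divided by a cruise speed, plus a fixed allowance for taxi, climb, and descent. A sketch under assumed parameters (a 500 mph cruise and a 30-minute overhead are assumptions of this example, so the result only approximates the figure above):

```python
def estimate_flight_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough flight-time estimate: cruise time plus a fixed taxi/climb/descent allowance."""
    return overhead_min + distance_miles / cruise_mph * 60

minutes = estimate_flight_minutes(640)
print(f"{int(minutes // 60)} h {round(minutes % 60)} min")  # 1 h 47 min with these assumptions
```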

Flight carbon footprint between Cheongju International Airport (CJJ) and Beijing Nanyuan Airport (NAY)

On average, flying from Cheongju to Beijing generates about 118 kg (260 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
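The unit conversion behind the pounds figure is simple: one kilogram is about 2.20462 pounds (note the rounded result can differ by a pound or so depending on the unrounded CO2 estimate):

```python
KG_TO_LB = 2.20462  # pounds per kilogram

co2_kg = 118
co2_lb = co2_kg * KG_TO_LB
print(round(co2_lb))  # 260
```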

Map of flight path and driving directions from Cheongju to Beijing

See the map of the shortest flight path between Cheongju International Airport (CJJ) and Beijing Nanyuan Airport (NAY).

Airport information

Origin Cheongju International Airport
City: Cheongju
Country: South Korea
IATA Code: CJJ
ICAO Code: RKTU
Coordinates: 36°42′59″N, 127°29′56″E
Destination Beijing Nanyuan Airport
City: Beijing
Country: China
IATA Code: NAY
ICAO Code: ZBNY
Coordinates: 39°46′58″N, 116°23′16″E
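The coordinates above are given in degrees/minutes/seconds; distance formulas need decimal degrees. A minimal conversion sketch:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# CJJ: 36°42′59″N, 127°29′56″E
print(round(dms_to_decimal(36, 42, 59, "N"), 5))   # 36.71639
print(round(dms_to_decimal(127, 29, 56, "E"), 5))  # 127.49889
```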