
How far is Beijing from Ulanqab?

The distance between Ulanqab (Ulanqab Jining Airport) and Beijing (Beijing Daxing International Airport) is 207 miles / 333 kilometers / 180 nautical miles.

The driving distance from Ulanqab (UCB) to Beijing (PKX) is 255 miles / 410 kilometers, and travel time by car is about 4 hours 42 minutes.

Ulanqab Jining Airport – Beijing Daxing International Airport

207 miles / 333 kilometers / 180 nautical miles

Distance from Ulanqab to Beijing

There are several ways to calculate the distance from Ulanqab to Beijing. Here are two standard methods:

Vincenty's formula (applied above)
  • 207.155 miles
  • 333.384 kilometers
  • 180.013 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
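As a rough sketch of how such an ellipsoidal distance can be reproduced (this is an assumption about tooling, not the site's own code), the pyproj library's Geod class solves the same inverse geodesic problem on the WGS-84 ellipsoid; it uses Karney's geodesic algorithm rather than Vincenty's iteration, but the result should agree closely with the figures above.

```python
# Ellipsoidal-distance sketch with pyproj (an assumed dependency, not the
# site's documented method). Geod.inv solves the inverse geodesic problem
# on the WGS-84 ellipsoid, the same problem Vincenty's formula addresses.
from pyproj import Geod

geod = Geod(ellps="WGS84")

# Airport coordinates from the airport information below, in decimal degrees.
ucb_lat, ucb_lon = 41 + 7/60 + 46/3600, 113 + 6/60 + 29/3600    # Ulanqab (UCB)
pkx_lat, pkx_lon = 39 + 30/60 + 33/3600, 116 + 24/60 + 38/3600  # Beijing Daxing (PKX)

# inv() takes lon/lat pairs and returns forward azimuth, back azimuth, distance in meters.
_, _, meters = geod.inv(ucb_lon, ucb_lat, pkx_lon, pkx_lat)

print(f"{meters / 1000:.3f} km")               # ~333 km
print(f"{meters / 1609.344:.3f} miles")        # ~207 miles
print(f"{meters / 1852:.3f} nautical miles")   # ~180 NM
```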

Haversine formula
  • 206.870 miles
  • 332.925 kilometers
  • 179.765 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
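A minimal haversine sketch in Python (function names here are illustrative, not the site's code) reproduces the spherical-earth figures above from the airport coordinates listed at the bottom of the page.

```python
# Great-circle (haversine) distance between UCB and PKX, assuming a
# spherical Earth with the commonly used mean radius of 6371 km.
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_KM = 6371.0

def dms_to_decimal(deg: float, minutes: float, seconds: float) -> float:
    """Convert degrees/minutes/seconds to decimal degrees."""
    return deg + minutes / 60 + seconds / 3600

def haversine_km(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance between two latitude/longitude points, in km."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * EARTH_RADIUS_KM * asin(sqrt(a))

# Ulanqab Jining Airport (UCB): 41°7′46″N, 113°6′29″E
# Beijing Daxing International Airport (PKX): 39°30′33″N, 116°24′38″E
ucb = (dms_to_decimal(41, 7, 46), dms_to_decimal(113, 6, 29))
pkx = (dms_to_decimal(39, 30, 33), dms_to_decimal(116, 24, 38))

km = haversine_km(*ucb, *pkx)
print(f"{km:.3f} km")                 # ≈ 333 km
print(f"{km / 1.609344:.3f} miles")   # ≈ 207 miles
print(f"{km / 1.852:.3f} NM")         # ≈ 180 nautical miles
```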

How long does it take to fly from Ulanqab to Beijing?

The estimated flight time from Ulanqab Jining Airport to Beijing Daxing International Airport is 53 minutes.
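A common rule of thumb for such estimates (an assumption here; the site does not publish its exact method) is to add about 30 minutes for taxi, climb, and descent to the cruise time at roughly 500 mph, which lands close to the figure above.

```python
# Rough flight-time sketch using an assumed rule of thumb:
# overhead for taxi/climb/descent plus cruise time at ~500 mph.
def estimate_flight_minutes(distance_miles: float,
                            cruise_mph: float = 500.0,
                            overhead_minutes: float = 30.0) -> float:
    return overhead_minutes + distance_miles / cruise_mph * 60

print(round(estimate_flight_minutes(207.155)))  # ~55 minutes, in line with the 53-minute estimate
```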

What is the time difference between Ulanqab and Beijing?

There is no time difference between Ulanqab and Beijing.

Flight carbon footprint between Ulanqab Jining Airport (UCB) and Beijing Daxing International Airport (PKX)

On average, flying from Ulanqab to Beijing generates about 56 kg of CO2 per passenger, which is equivalent to roughly 122 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.

Map of flight path and driving directions from Ulanqab to Beijing

See the map of the shortest flight path between Ulanqab Jining Airport (UCB) and Beijing Daxing International Airport (PKX).

Airport information

Origin: Ulanqab Jining Airport
City: Ulanqab
Country: China
IATA Code: UCB
ICAO Code: ZBUC
Coordinates: 41°7′46″N, 113°6′29″E
Destination: Beijing Daxing International Airport
City: Beijing
Country: China
IATA Code: PKX
ICAO Code: ZBAD
Coordinates: 39°30′33″N, 116°24′38″E