How far is Beijing from Lüliang?

The distance between Lüliang (Lüliang Dawu Airport) and Beijing (Beijing Daxing International Airport) is 312 miles / 502 kilometers / 271 nautical miles.

The driving distance from Lüliang (LLV) to Beijing (PKX) is 396 miles / 638 kilometers, and travel time by car is about 7 hours 15 minutes.

Lüliang Dawu Airport – Beijing Daxing International Airport

312 miles / 502 kilometers / 271 nautical miles

Distance from Lüliang to Beijing

There are several ways to calculate the distance from Lüliang to Beijing. Here are two standard methods:

Vincenty's formula (applied above)
  • 311.659 miles
  • 501.567 kilometers
  • 270.824 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
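
For readers who want to reproduce the ellipsoidal figure, below is a minimal Python sketch of Vincenty's iterative inverse solution. The WGS-84 ellipsoid parameters and the convergence tolerance are assumptions; the page does not state which ellipsoid or implementation the calculator actually uses.

```python
import math

# WGS-84 ellipsoid parameters (an assumption; the page does not name the ellipsoid it uses)
A_AXIS = 6378137.0               # semi-major axis in metres
FLATTENING = 1 / 298.257223563
B_AXIS = (1 - FLATTENING) * A_AXIS

def vincenty_distance_m(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Vincenty's inverse solution: geodesic distance in metres on the ellipsoid."""
    f = FLATTENING
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))   # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):                                 # iterate until longitude converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam, cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0                                        # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (A_AXIS ** 2 - B_AXIS ** 2) / B_AXIS ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
        - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sigma_m ** 2)))
    return B_AXIS * A * (sigma - delta_sigma)

# LLV (37°40′59″N, 111°8′34″E) to PKX (39°30′33″N, 116°24′38″E)
print(round(vincenty_distance_m(37.6831, 111.1428, 39.5092, 116.4106) / 1000, 1))  # ≈ 501.6 km
```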

Haversine formula
  • 311.114 miles
  • 500.690 kilometers
  • 270.351 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
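
A spherical sketch of the same calculation is shorter. The mean Earth radius of 6371 km is an assumed value; the function name is illustrative rather than taken from the page.

```python
import math

EARTH_RADIUS_KM = 6371.0   # mean Earth radius; the exact value used by the site is an assumption

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two latitude/longitude points."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

# LLV (37°40′59″N, 111°8′34″E) to PKX (39°30′33″N, 116°24′38″E)
print(round(haversine_km(37.6831, 111.1428, 39.5092, 116.4106), 1))  # ≈ 500.7 km, in line with the figure above
```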

How long does it take to fly from Lüliang to Beijing?

The estimated flight time from Lüliang Dawu Airport to Beijing Daxing International Airport is 1 hour and 5 minutes.
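
As a rough illustration of where such an estimate can come from, the snippet below uses a common rule of thumb: a fixed allowance for taxi, climb and descent plus cruise time at a typical airliner speed. The 30-minute allowance and 500 mph cruise speed are assumptions, not necessarily this calculator's exact formula.

```python
# Rule-of-thumb flight-time estimate (assumed parameters, not necessarily this site's method)
distance_miles = 312
cruise_speed_mph = 500
total_minutes = 30 + distance_miles / cruise_speed_mph * 60
print(f"{int(total_minutes // 60)} h {round(total_minutes % 60)} min")  # ≈ 1 h 7 min, close to the quoted 1 h 5 min
```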

What is the time difference between Lüliang and Beijing?

There is no time difference between Lüliang and Beijing.

Flight carbon footprint between Lüliang Dawu Airport (LLV) and Beijing Daxing International Airport (PKX)

On average, flying from Lüliang to Beijing generates about 71 kg of CO2 per passenger, which is equivalent to roughly 156 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
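
As a rough sanity check, the snippet below converts the 71 kg figure to pounds and backs out the implied jet-fuel burn using the widely used factor of about 3.16 kg of CO2 per kg of jet fuel. The factor and method are assumptions; the calculator's own emission factors are not stated on this page.

```python
co2_kg = 71
fuel_kg = co2_kg / 3.16                         # implied jet fuel burned per passenger (assumed 3.16 kg CO2 per kg fuel)
print(round(co2_kg * 2.20462, 1), round(fuel_kg, 1))  # ≈ 156.5 lbs of CO2, from ≈ 22.5 kg of fuel per passenger
```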

Map of flight path and driving directions from Lüliang to Beijing

See the map of the shortest flight path between Lüliang Dawu Airport (LLV) and Beijing Daxing International Airport (PKX).

Airport information

Origin: Lüliang Dawu Airport
City: Lüliang
Country: China
IATA Code: LLV
ICAO Code: ZBLL
Coordinates: 37°40′59″N, 111°8′34″E
Destination: Beijing Daxing International Airport
City: Beijing
Country: China
IATA Code: PKX
ICAO Code: ZBAD
Coordinates: 39°30′33″N, 116°24′38″E
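
The coordinates above are listed in degrees, minutes and seconds, while the distance formulas earlier expect decimal degrees. A minimal conversion sketch (the helper name is hypothetical):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds, as listed above, to signed decimal degrees."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

# LLV: 37°40′59″N, 111°8′34″E  ->  approximately 37.6831, 111.1428
print(dms_to_decimal(37, 40, 59, "N"), dms_to_decimal(111, 8, 34, "E"))
```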