How far is Bangda from Ejin Banner?

The distance between Ejin Banner (Ejin Banner Taolai Airport) and Bangda (Qamdo Bamda Airport) is 819 miles / 1319 kilometers / 712 nautical miles.

The driving distance from Ejin Banner (EJN) to Bangda (BPX) is 1371 miles / 2206 kilometers, and travel time by car is about 26 hours 40 minutes.

Distance from Ejin Banner to Bangda

There are several ways to calculate the distance from Ejin Banner to Bangda. Here are two standard methods:

Vincenty's formula (applied above)
  • 819.373 miles
  • 1318.653 kilometers
  • 712.016 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
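For readers who want to reproduce the figure, here is a minimal sketch of Vincenty's inverse method in Python, assuming the WGS-84 ellipsoid (the site does not state which ellipsoid it uses, and the function and parameter names below are illustrative):

```python
import math

def vincenty_distance_m(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Vincenty's inverse formula on the WGS-84 ellipsoid (sketch)."""
    a = 6378137.0            # WGS-84 semi-major axis, metres
    f = 1 / 298.257223563    # WGS-84 flattening
    b = (1 - f) * a          # semi-minor axis, metres

    # Reduced latitudes and longitude difference, in radians
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0.0:
            return 0.0                      # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1.0 - sin_alpha ** 2
        # cos(2 * sigma_m); zero for points on the equatorial line
        cos_2sm = (cos_sigma - 2.0 * sinU1 * sinU2 / cos2_alpha
                   if cos2_alpha else 0.0)
        C = f / 16.0 * cos2_alpha * (4.0 + f * (4.0 - 3.0 * cos2_alpha))
        lam_prev = lam
        lam = L + (1.0 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1.0 + 2.0 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    # Ellipsoidal correction terms
    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
        * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma)    # geodesic distance, metres
```

With the airport coordinates listed in the airport information below (about 42.0153°N, 101.0003°E for EJN and 30.5533°N, 97.1081°E for BPX), vincenty_distance_m(42.0153, 101.0003, 30.5533, 97.1081) / 1000 should land close to the 1318.653 km quoted above.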

Haversine formula
  • 820.821 miles
  • 1320.983 kilometers
  • 713.274 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
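As a cross-check, the haversine formula fits in a few lines. A minimal sketch, assuming the conventional mean Earth radius of 6,371 km (the radius the calculator uses is not stated):

```python
import math

def haversine_distance_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h))

# EJN (42.0153 N, 101.0003 E) to BPX (30.5533 N, 97.1081 E)
print(haversine_distance_km(42.0153, 101.0003, 30.5533, 97.1081))
# ≈ 1321 km, in line with the 1320.983 km figure above
```

The small gap between the two results (about 2.3 km here) is the usual price of treating the Earth as a perfect sphere rather than an ellipsoid.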

How long does it take to fly from Ejin Banner to Bangda?

The estimated flight time from Ejin Banner Taolai Airport to Qamdo Bamda Airport is 2 hours and 3 minutes.
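The page does not state how this estimate is derived, but a common back-of-the-envelope model is cruise time plus a fixed allowance for taxi, climb, and descent. A sketch with illustrative numbers (the 500 mph average cruise and 30-minute allowance are assumptions, so it will not match the 2 hours 3 minutes figure exactly):

```python
def estimated_flight_time(distance_miles, cruise_mph=500.0, overhead_min=30.0):
    """Rough block time: cruise time plus a fixed taxi/climb/descent
    allowance. Both parameters are illustrative assumptions."""
    total_min = round(distance_miles / cruise_mph * 60 + overhead_min)
    hours, minutes = divmod(total_min, 60)
    return f"{hours} hours {minutes} minutes"

print(estimated_flight_time(819.373))   # "2 hours 8 minutes" with these assumptions
```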

Flight carbon footprint between Ejin Banner Taolai Airport (EJN) and Qamdo Bamda Airport (BPX)

On average, flying from Ejin Banner to Bangda generates about 137 kg of CO2 per passenger, which is roughly 301 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
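A simple way to produce this kind of estimate is to multiply the fuel burned over the route by the CO2 released per kilogram of jet fuel (about 3.16 kg of CO2 per kg of fuel, a standard emission factor) and divide by the number of passengers. A sketch with illustrative aircraft parameters; the calculator's actual model is not published:

```python
def co2_per_passenger_kg(distance_km, fuel_burn_kg_per_km=4.0,
                         seats=150, load_factor=0.8):
    """Per-passenger CO2 from jet-fuel burn. Fuel burn, seat count,
    and load factor are illustrative assumptions for a narrow-body
    jet; 3.16 kg CO2 per kg of jet fuel is a standard emission factor."""
    total_co2_kg = distance_km * fuel_burn_kg_per_km * 3.16
    return total_co2_kg / (seats * load_factor)

print(round(co2_per_passenger_kg(1318.653)))   # ≈ 139 kg with these assumptions
```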

Map of flight path and driving directions from Ejin Banner to Bangda

See the map of the shortest flight path between Ejin Banner Taolai Airport (EJN) and Qamdo Bamda Airport (BPX).

Airport information

Origin Ejin Banner Taolai Airport
City: Ejin Banner
Country: China
IATA Code: EJN
ICAO Code: ZBEN
Coordinates: 42°0′55″N, 101°0′1″E
Destination Qamdo Bamda Airport
City: Bangda
Country: China
IATA Code: BPX
ICAO Code: ZUBD
Coordinates: 30°33′12″N, 97°6′29″E
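
The coordinates above are given in degrees, minutes, and seconds. To use them with the distance functions sketched earlier, they must first be converted to decimal degrees; a minimal parser (the function name is illustrative):

```python
import re

def dms_to_decimal(dms: str) -> float:
    """Convert a coordinate such as 42°0′55″N to decimal degrees."""
    m = re.fullmatch(r"(\d+)°(\d+)′(\d+)″([NSEW])", dms.strip())
    if not m:
        raise ValueError(f"unrecognised coordinate: {dms!r}")
    deg, minutes, seconds, hemi = m.groups()
    value = int(deg) + int(minutes) / 60 + int(seconds) / 3600
    return -value if hemi in "SW" else value

# Airport coordinates from the table above
print(dms_to_decimal("42°0′55″N"), dms_to_decimal("101°0′1″E"))   # ≈ 42.0153, 101.0003
print(dms_to_decimal("30°33′12″N"), dms_to_decimal("97°6′29″E"))  # ≈ 30.5533, 97.1081
```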