
How far is Bangda from Golmud?

The distance between Golmud (Golmud Airport) and Bangda (Qamdo Bamda Airport) is 425 miles / 683 kilometers / 369 nautical miles.

The driving distance from Golmud (GOQ) to Bangda (BPX) is 728 miles / 1172 kilometers, and travel time by car is about 14 hours 40 minutes.

Golmud Airport – Qamdo Bamda Airport


Distance from Golmud to Bangda

There are several ways to calculate the distance from Golmud to Bangda. Here are two standard methods:

Vincenty's formula (applied above)
  • 424.655 miles
  • 683.416 kilometers
  • 369.015 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
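A minimal pure-Python sketch of the inverse Vincenty iteration on the WGS-84 ellipsoid (the coordinates are the decimal-degree equivalents of the airport positions listed further down; the convergence tolerance and iteration cap are implementation choices, not part of the formula itself):

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2):
    """Inverse Vincenty formula on the WGS-84 ellipsoid; returns km."""
    a = 6378137.0                  # semi-major axis (m)
    f = 1 / 298.257223563          # flattening
    b = (1 - f) * a                # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):           # iterate lambda to convergence
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0             # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                   if cos2_alpha else 0.0)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2) -
        B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
        * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - d_sigma) / 1000.0

# Golmud (GOQ) to Bangda (BPX), decimal degrees
print(vincenty_km(36.40056, 94.78583, 30.55333, 97.10806))  # ≈ 683.4 km
```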

Haversine formula
  • 425.537 miles
  • 684.835 kilometers
  • 369.781 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
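The haversine computation is short enough to sketch in full; the 6371 km value below is the commonly used mean Earth radius, which reproduces the kilometre figure above to within a fraction of a percent:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of mean Earth radius, in km."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# Golmud (GOQ) to Bangda (BPX), decimal degrees
print(haversine_km(36.40056, 94.78583, 30.55333, 97.10806))  # ≈ 685 km
```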

How long does it take to fly from Golmud to Bangda?

The estimated flight time from Golmud Airport to Qamdo Bamda Airport is 1 hour and 18 minutes.
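Such estimates are usually built from a cruise-speed rule of thumb. A minimal sketch, assuming an average ~500 mph cruise speed plus a flat 30 minutes for taxi, climb, and descent (both constants are illustrative assumptions, not the calculator's actual model, which is why the result differs slightly from the figure above):

```python
def estimate_flight_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    # time spent at cruise speed plus a fixed taxi/climb/descent allowance
    return round(distance_miles / cruise_mph * 60 + overhead_min)

minutes = estimate_flight_minutes(425)
print(f"{minutes // 60} h {minutes % 60} min")  # 1 h 21 min with these assumptions
```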

What is the time difference between Golmud and Bangda?

There is no time difference between Golmud and Bangda.

Flight carbon footprint between Golmud Airport (GOQ) and Qamdo Bamda Airport (BPX)

On average, flying from Golmud to Bangda generates about 88 kg (194 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
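As a rough illustration of how a per-passenger figure like this can be computed (the emission factor below is an assumed value chosen for illustration; real calculators use aircraft type, load factor, and fuel-burn curves):

```python
# Assumed per-passenger emission factor for a short regional flight.
# This constant is illustrative only, not the site's published methodology.
CO2_KG_PER_PASSENGER_KM = 0.129
DISTANCE_KM = 683                 # great-circle distance for GOQ-BPX

co2_kg = CO2_KG_PER_PASSENGER_KM * DISTANCE_KM
co2_lb = co2_kg / 0.45359237      # exact kg-per-pound definition
print(round(co2_kg), round(co2_lb))  # ≈ 88 kg, ≈ 194 lb with this factor
```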

Map of flight path and driving directions from Golmud to Bangda

See the map of the shortest flight path between Golmud Airport (GOQ) and Qamdo Bamda Airport (BPX).

Airport information

Origin: Golmud Airport
City: Golmud
Country: China
IATA Code: GOQ
ICAO Code: ZLGM
Coordinates: 36°24′2″N, 94°47′9″E
Destination: Qamdo Bamda Airport
City: Bangda
Country: China
IATA Code: BPX
ICAO Code: ZUBD
Coordinates: 30°33′12″N, 97°6′29″E
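The coordinates above are given in degrees, minutes, and seconds; converting them to the decimal degrees that the distance formulas expect is a small exercise:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert a degrees/minutes/seconds coordinate to decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# Golmud Airport: 36°24′2″N, 94°47′9″E
lat = dms_to_decimal(36, 24, 2, "N")
lon = dms_to_decimal(94, 47, 9, "E")
print(round(lat, 5), round(lon, 5))  # 36.40056 94.78583
```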