
How far is Bangda from Jiansanjiang?

The distance between Jiansanjiang (Jiansanjiang Airport) and Bangda (Qamdo Bamda Airport) is 2205 miles / 3548 kilometers / 1916 nautical miles.

The driving distance from Jiansanjiang (JSJ) to Bangda (BPX) is 2884 miles / 4642 kilometers, and travel time by car is about 52 hours 25 minutes.

Jiansanjiang Airport – Qamdo Bamda Airport

2205 miles · 3548 kilometers · 1916 nautical miles


Distance from Jiansanjiang to Bangda

There are several ways to calculate the distance from Jiansanjiang to Bangda. Here are two standard methods:

Vincenty's formula (applied above)
  • 2204.667 miles
  • 3548.067 kilometers
  • 1915.803 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
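As a sketch of how this works, here is a minimal Python implementation of Vincenty's inverse formula on the WGS-84 ellipsoid, applied to the two airports' coordinates from the airport information below. This is an illustrative version, not the calculator's own code: it omits the antipodal special cases a production implementation needs.

```python
import math

def vincenty_inverse(lat1, lon1, lat2, lon2):
    """Distance in meters via Vincenty's inverse formula on WGS-84
    (sketch; no handling of nearly antipodal point pairs)."""
    a = 6378137.0            # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563    # WGS-84 flattening
    b = (1 - f) * a          # semi-minor axis (m)

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate lambda until convergence
        sin_sigma = math.hypot(cosU2 * math.sin(lam),
                               cosU1 * sinU2 - sinU1 * cosU2 * math.cos(lam))
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * math.cos(lam)
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * math.sin(lam) / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2) -
        B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - d_sigma)

# JSJ (47°6′36″N, 132°39′37″E) to BPX (30°33′12″N, 97°6′29″E)
meters = vincenty_inverse(47.11, 132.660278, 30.553333, 97.108056)
print(round(meters / 1000, 1))  # ≈ 3548 km
```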

Haversine formula
  • 2201.734 miles
  • 3543.347 kilometers
  • 1913.255 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
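The haversine calculation is short enough to show in full. The sketch below assumes a mean Earth radius of 6,371 km; with that radius it lands close to the 3,543 km figure above.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, R=6371.0):
    """Great-circle distance in km on a sphere of radius R (mean Earth radius)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    # haversine of the central angle between the two points
    h = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * R * math.asin(math.sqrt(h))

km = haversine_km(47.11, 132.660278, 30.553333, 97.108056)
print(round(km, 1))  # ≈ 3543 km
```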

How long does it take to fly from Jiansanjiang to Bangda?

The estimated flight time from Jiansanjiang Airport to Qamdo Bamda Airport is 4 hours and 40 minutes.
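The calculator does not publish its flight-time formula, but a common heuristic divides the great-circle distance by an assumed average block speed. The speed below (470 mph) is an assumption chosen only to show the shape of the estimate; it happens to land near the quoted figure.

```python
def flight_time_hours(distance_miles, block_speed_mph=470):
    # Rough estimate: distance over an assumed average block speed,
    # which folds taxi, climb, cruise, and descent into one number.
    return distance_miles / block_speed_mph

t = flight_time_hours(2205)
h, m = int(t), round((t - int(t)) * 60)
print(f"{h} h {m} min")  # roughly 4 h 40 min
```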

Flight carbon footprint between Jiansanjiang Airport (JSJ) and Qamdo Bamda Airport (BPX)

On average, flying from Jiansanjiang to Bangda generates about 241 kg of CO2 per passenger, which is equivalent to 531 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
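The kilogram-to-pound conversion above is straightforward to verify, using the exact definition of the pound (0.45359237 kg):

```python
co2_kg = 241
KG_PER_LB = 0.45359237           # exact definition of the avoirdupois pound
co2_lbs = co2_kg / KG_PER_LB
print(round(co2_lbs))  # → 531
```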

Map of flight path and driving directions from Jiansanjiang to Bangda

See the map of the shortest flight path between Jiansanjiang Airport (JSJ) and Qamdo Bamda Airport (BPX).

Airport information

Origin Jiansanjiang Airport
City: Jiansanjiang
Country: China
IATA Code: JSJ
ICAO Code: ZYJS
Coordinates: 47°6′36″N, 132°39′37″E
Destination Qamdo Bamda Airport
City: Bangda
Country: China
IATA Code: BPX
ICAO Code: ZUBD
Coordinates: 30°33′12″N, 97°6′29″E
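The coordinates above are given in degrees, minutes, and seconds; the distance formulas need decimal degrees. The conversion is a small helper like this:

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter (N/S/E/W)
    to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

# JSJ: 47°6′36″N, 132°39′37″E
print(round(dms_to_decimal(47, 6, 36, "N"), 6))    # → 47.11
print(round(dms_to_decimal(132, 39, 37, "E"), 6))  # ≈ 132.660278
```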