Distance between Qiqihar (NDG) and Bangda (BPX)

Flight distance from Qiqihar to Bangda (Qiqihar Sanjiazi Airport – Qamdo Bamda Airport) is 1831 miles / 2947 kilometers / 1592 nautical miles. Estimated flight time is 3 hours 58 minutes.

Driving distance from Qiqihar (NDG) to Bangda (BPX) is 2477 miles / 3987 kilometers and travel time by car is about 47 hours 18 minutes.

Qiqihar – Bangda

1831 miles / 2947 kilometers / 1592 nautical miles

How far is Bangda from Qiqihar?

There are several ways to calculate the distance between Qiqihar and Bangda. Here are two common methods:

Vincenty's formula (applied above)
  • 1831.489 miles
  • 2947.496 kilometers
  • 1591.521 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth’s surface, using an ellipsoidal model of the earth.
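
For readers who want to reproduce the figure, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The ellipsoid parameters and the decimal airport coordinates (converted from the airport information below) are our assumptions; the page does not publish its exact inputs:

```python
import math

# WGS-84 ellipsoid (the usual choice for Vincenty; the site's exact
# parameters are not stated, so treat this as an illustrative sketch)
A = 6378137.0             # semi-major axis in meters
F = 1 / 298.257223563     # flattening
B = (1 - F) * A           # semi-minor axis in meters

def vincenty_miles(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Ellipsoidal distance in statute miles via Vincenty's inverse formula."""
    U1 = math.atan((1 - F) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - F) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)
    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = F / 16 * cos2_alpha * (4 + F * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * F * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:  # iterate until longitude difference converges
            break
    u2 = cos2_alpha * (A ** 2 - B ** 2) / B ** 2
    A_ = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B_ = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B_ * sin_sigma * (cos_2sm + B_ / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B_ / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return B * A_ * (sigma - d_sigma) / 1609.344  # meters -> statute miles

# NDG and BPX coordinates from the airport information below
print(vincenty_miles(47.2394, 123.9178, 30.5533, 97.1081))  # ~1831.5 miles
```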

Haversine formula
  • 1829.980 miles
  • 2945.067 kilometers
  • 1590.209 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
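
A matching sketch of the haversine formula, assuming the commonly used mean earth radius of 6371 km (the radius behind the figures above is not stated):

```python
import math

EARTH_RADIUS_KM = 6371.0  # mean earth radius; the site's exact value is an assumption here

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers on a spherical earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

print(haversine_km(47.2394, 123.9178, 30.5533, 97.1081))  # ~2945 km
```

The two methods differ by about 1.5 miles here, which is the typical scale of the spherical-versus-ellipsoidal discrepancy over a route of this length.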

Flight Duration

Estimated flight time from Qiqihar Sanjiazi Airport to Qamdo Bamda Airport is 3 hours 58 minutes.
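
The page does not state how this estimate is derived. A common rule of thumb is distance divided by an average cruise speed, plus a fixed allowance for taxi, takeoff and landing; with an assumed 850 km/h and 30 minutes (values back-fitted here, not published by the site), the figure above is reproduced:

```python
DISTANCE_KM = 2947    # flight distance from above
CRUISE_KMH = 850      # assumed average cruise speed (not stated by the source)
OVERHEAD_MIN = 30     # assumed allowance for taxi, takeoff and landing

total_min = DISTANCE_KM / CRUISE_KMH * 60 + OVERHEAD_MIN
print(f"{int(total_min // 60)} hours {int(total_min % 60)} minutes")  # 3 hours 58 minutes
```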

Time difference

The time difference between Qiqihar and Bangda is 2 hours. Bangda is 2 hours behind Qiqihar.

Carbon dioxide emissions

On average, flying from Qiqihar to Bangda generates about 203 kg of CO2 per passenger; 203 kilograms equals 447 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
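
The emission factor behind the 203 kg figure is not published; dividing it by the flight distance implies roughly 69 g of CO2 per passenger-kilometer. A sketch using that back-calculated factor (an assumption, not the site's documented methodology):

```python
DISTANCE_KM = 2947
CO2_KG_PER_PAX_KM = 0.069   # back-calculated from 203 kg / 2947 km; not a published factor
KG_TO_LB = 2.20462

co2_kg = DISTANCE_KM * CO2_KG_PER_PAX_KM
print(f"{co2_kg:.0f} kg = {co2_kg * KG_TO_LB:.0f} lb")  # ~203 kg = 448 lb (447 with truncation)
```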

Map of flight path and driving directions from Qiqihar to Bangda

Shortest flight path between Qiqihar Sanjiazi Airport (NDG) and Qamdo Bamda Airport (BPX).

Airport information

Origin Qiqihar Sanjiazi Airport
City: Qiqihar
Country: China
IATA Code: NDG
ICAO Code: ZYQQ
Coordinates: 47°14′22″N, 123°55′4″E
Destination Qamdo Bamda Airport
City: Bangda
Country: China
IATA Code: BPX
ICAO Code: ZUBD
Coordinates: 30°33′12″N, 97°6′29″E