
How far is Baise from Bole?

The distance between Bole (Alashankou Bole (Bortala) Airport) and Baise (Baise Bama Airport) is 2014 miles / 3241 kilometers / 1750 nautical miles.

The driving distance from Bole (BPL) to Baise (AEB) is 2560 miles / 4120 kilometers, and travel time by car is about 46 hours 34 minutes.

Alashankou Bole (Bortala) Airport – Baise Bama Airport

  • 2014 miles
  • 3241 kilometers
  • 1750 nautical miles


Distance from Bole to Baise

There are several ways to calculate the distance from Bole to Baise. Here are two standard methods:

Vincenty's formula (applied above)
  • 2013.725 miles
  • 3240.776 kilometers
  • 1749.879 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
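As a minimal sketch, Vincenty's inverse formula can be implemented on the WGS-84 ellipsoid (the semi-major axis `a` and flattening `f` below are the WGS-84 values; the site does not state which ellipsoid it uses):

```python
import math

def vincenty_m(lat1, lon1, lat2, lon2):
    """Ellipsoidal distance in metres via Vincenty's inverse formula (WGS-84 assumed)."""
    a = 6378137.0               # semi-major axis (m)
    f = 1 / 298.257223563       # flattening
    b = (1 - f) * a             # semi-minor axis (m)
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)
    lam = L
    for _ in range(200):        # iterate lambda until convergence
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0          # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break
    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2) -
        B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - d_sigma)

# BPL (44°53'42"N, 82°18'0"E) to AEB (23°43'14"N, 106°57'35"E)
d = vincenty_m(44.8950, 82.3000, 23.7206, 106.9597)
```

With the airport coordinates from the table below, this yields roughly 3240.8 km, in line with the figure above.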

Haversine formula
  • 2014.159 miles
  • 3241.475 kilometers
  • 1750.256 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
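The haversine result can be reproduced in a few lines (a mean Earth radius of 6371 km is assumed; the site's exact radius is not stated):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres, assuming a spherical Earth."""
    R = 6371.0  # mean Earth radius in km (assumed)
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

# BPL (44°53'42"N, 82°18'0"E) to AEB (23°43'14"N, 106°57'35"E)
d = haversine_km(44.8950, 82.3000, 23.7206, 106.9597)
```

This gives roughly 3241 km, matching the haversine figure above.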

How long does it take to fly from Bole to Baise?

The estimated flight time from Alashankou Bole (Bortala) Airport to Baise Bama Airport is 4 hours and 18 minutes.
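The page does not state its timing model. A common rough approach adds a fixed taxi/climb/descent allowance to cruise time at a typical jet speed; the 500 mph cruise speed and 30-minute allowance below are assumptions for illustration, so the result differs slightly from the site's figure:

```python
def estimate_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough block-time estimate: cruise time plus a fixed ground/climb allowance."""
    minutes = overhead_min + distance_miles / cruise_mph * 60
    hours, mins = divmod(round(minutes), 60)
    return f"{hours} h {mins} min"

result = estimate_flight_time(2014)  # "4 h 32 min" with these assumed parameters
```

Different assumed cruise speeds and allowances shift the estimate by tens of minutes, which is why published flight times vary between calculators.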

Flight carbon footprint between Alashankou Bole (Bortala) Airport (BPL) and Baise Bama Airport (AEB)

On average, flying from Bole to Baise generates about 219 kg (483 lb) of CO2 per passenger. These figures are estimates and include only the CO2 produced by burning jet fuel.
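The unit conversion and the implied per-distance intensity can be checked with a few lines; the 219 kg value is the page's estimate, and the grams-per-kilometre figure below is derived from it, not an official emission factor:

```python
KG_PER_LB = 0.45359237   # exact definition of the international pound

co2_kg = 219.0           # page's per-passenger estimate
distance_km = 3241.0     # great-circle distance from above

co2_lb = co2_kg / KG_PER_LB          # ~483 lb
g_per_km = co2_kg / distance_km * 1000  # implied intensity, ~68 g CO2 per passenger-km
```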

Map of flight path and driving directions from Bole to Baise

See the map of the shortest flight path between Alashankou Bole (Bortala) Airport (BPL) and Baise Bama Airport (AEB).

Airport information

Origin: Alashankou Bole (Bortala) Airport
City: Bole
Country: China
IATA Code: BPL
ICAO Code: ZWBL
Coordinates: 44°53′42″N, 82°18′0″E
Destination: Baise Bama Airport
City: Baise
Country: China
IATA Code: AEB
ICAO Code: ZGBS
Coordinates: 23°43′14″N, 106°57′35″E