
How far is Ji'an from Bole?

The distance between Bole (Alashankou Bole (Bortala) Airport) and Ji'an (Jinggangshan Airport) is 2181 miles / 3510 kilometers / 1895 nautical miles.

The driving distance from Bole (BPL) to Ji'an (JGS) is 2632 miles / 4236 kilometers, and travel time by car is about 47 hours 38 minutes.

Alashankou Bole (Bortala) Airport – Jinggangshan Airport


Distance from Bole to Ji'an

There are several ways to calculate the distance from Bole to Ji'an. Here are two standard methods:

Vincenty's formula (applied above)
  • 2181.218 miles
  • 3510.331 kilometers
  • 1895.427 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
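A minimal sketch of Vincenty's inverse method on the WGS-84 ellipsoid, using the coordinates from the airport information below. This is a standard textbook implementation, not the calculator's own code; it does not handle the rare nearly-antipodal cases where the iteration fails to converge.

```python
from math import radians, sin, cos, tan, atan, atan2, sqrt

def vincenty_km(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns km."""
    a, f = 6378137.0, 1 / 298.257223563   # WGS-84 semi-major axis, flattening
    b = (1 - f) * a
    L = radians(lon2 - lon1)
    U1 = atan((1 - f) * tan(radians(lat1)))   # reduced latitudes
    U2 = atan((1 - f) * tan(radians(lat2)))
    sinU1, cosU1, sinU2, cosU2 = sin(U1), cos(U1), sin(U2), cos(U2)
    lam = L
    for _ in range(200):                      # iterate lambda to convergence
        sinlam, coslam = sin(lam), cos(lam)
        sin_sigma = sqrt((cosU2 * sinlam) ** 2 +
                         (cosU1 * sinU2 - sinU1 * cosU2 * coslam) ** 2)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * coslam
        sigma = atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sinlam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break
    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
        B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) *
        (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0

# BPL: 44°53'42"N, 82°18'0"E   JGS: 26°51'24"N, 114°44'13"E
print(round(vincenty_km(44.895, 82.3, 26.856667, 114.736944), 1))  # ≈ 3510 km
```

With the DMS coordinates converted to decimal degrees, this reproduces the ~3510 km figure quoted above to within the rounding of the inputs.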

Haversine formula
  • 2179.408 miles
  • 3507.417 kilometers
  • 1893.854 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
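The haversine computation is compact enough to show in full. A minimal sketch, assuming a mean Earth radius of 6371 km and the airport coordinates listed below:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points on a sphere, in km."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * radius_km * asin(sqrt(a))

# BPL: 44°53'42"N, 82°18'0"E   JGS: 26°51'24"N, 114°44'13"E
print(round(haversine_km(44.895, 82.3, 26.856667, 114.736944), 1))  # ≈ 3507 km
```

The result lands within a kilometer or two of the 3507.417 km quoted above; the small gap comes from rounding the coordinates and the choice of Earth radius.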

How long does it take to fly from Bole to Ji'an?

The estimated flight time from Alashankou Bole (Bortala) Airport to Jinggangshan Airport is 4 hours and 37 minutes.
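The page does not disclose its timing model, but a common heuristic is to divide the great-circle distance by an assumed average block speed. The speed below is a hypothetical round number, not taken from the source; it lands close to, though not exactly on, the quoted 4 h 37 m.

```python
distance_miles = 2181          # great-circle distance from above
block_speed_mph = 475          # assumed average block speed (hypothetical)

hours = distance_miles / block_speed_mph
h = int(hours)
m = round((hours - h) * 60)
print(f"{h} h {m} min")        # 4 h 35 min with these assumptions
```

Nudging the assumed speed by a few mph shifts the result by a minute or two, which is why published estimates from different calculators rarely agree exactly.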

Flight carbon footprint between Alashankou Bole (Bortala) Airport (BPL) and Jinggangshan Airport (JGS)

On average, flying from Bole to Ji'an generates about 238 kg (525 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
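The kilogram-to-pound conversion is a fixed definition, so it is easy to check:

```python
KG_PER_LB = 0.45359237         # international pound, defined exactly in kg

co2_kg = 238
co2_lbs = co2_kg / KG_PER_LB
print(round(co2_lbs))          # 525
```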

Map of flight path and driving directions from Bole to Ji'an

See the map of the shortest flight path between Alashankou Bole (Bortala) Airport (BPL) and Jinggangshan Airport (JGS).

Airport information

Origin: Alashankou Bole (Bortala) Airport
City: Bole
Country: China
IATA Code: BPL
ICAO Code: ZWBL
Coordinates: 44°53′42″N, 82°18′0″E
Destination: Jinggangshan Airport
City: Ji'an
Country: China
IATA Code: JGS
ICAO Code: ZSJA
Coordinates: 26°51′24″N, 114°44′13″E