
How far is Bamaga from Barcaldine?

The distance between Barcaldine (Barcaldine Airport) and Bamaga (Northern Peninsula Airport) is 888 miles / 1428 kilometers / 771 nautical miles.

The driving distance from Barcaldine (BCI) to Bamaga (ABM) is 1212 miles / 1950 kilometers, and travel time by car is about 33 hours 13 minutes.


Distance from Barcaldine to Bamaga

There are several ways to calculate the distance from Barcaldine to Bamaga. Here are two standard methods:

Vincenty's formula (applied above)
  • 887.581 miles
  • 1428.423 kilometers
  • 771.287 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
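
To reproduce the figure, here is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid. The decimal coordinates are converted from the DMS values listed under Airport information below; the function name, tolerance, and iteration cap are illustrative choices, not the site's implementation.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Ellipsoidal distance in kilometers (Vincenty inverse, WGS-84)."""
    a, f = 6378137.0, 1 / 298.257223563  # WGS-84 semi-major axis (m) and flattening
    b = a * (1 - f)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)
    lam = L
    for _ in range(max_iter):  # iterate the longitude difference on the auxiliary sphere
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1.0 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev, lam = lam, L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (2 * cos_2sm ** 2 - 1)))
        if abs(lam - lam_prev) < tol:
            break
    u2 = cos2_alpha * (a * a - b * b) / (b * b)
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (2 * cos_2sm ** 2 - 1)
        - B / 6 * cos_2sm * (4 * sin_sigma ** 2 - 3) * (4 * cos_2sm ** 2 - 3)))
    return b * A * (sigma - d_sigma) / 1000.0  # metres -> kilometres

bci = (-23.565278, 145.306944)  # Barcaldine, 23°33′55″S 145°18′25″E
abm = (-10.950556, 142.458889)  # Bamaga, 10°57′2″S 142°27′32″E
print(round(vincenty_distance(*bci, *abm), 3))  # ~1428.4 km
```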

Haversine formula
  • 891.504 miles
  • 1434.736 kilometers
  • 774.695 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
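
The haversine version is far simpler. A minimal sketch, using the same decimal coordinates and assuming a mean Earth radius of 6371.0088 km (the site does not state which radius it uses):

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, r=6371.0088):
    """Great-circle distance in kilometers on a sphere of radius r."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * r * math.asin(math.sqrt(h))

print(round(haversine_distance(-23.565278, 145.306944,
                               -10.950556, 142.458889), 3))  # ~1434.7 km
```

The roughly six-kilometer gap between the two results comes from the Earth's flattening, which the spherical model ignores; the ellipsoidal (Vincenty) figure is the one quoted at the top of the page.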

How long does it take to fly from Barcaldine to Bamaga?

The estimated flight time from Barcaldine Airport to Northern Peninsula Airport is 2 hours and 10 minutes.
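
The site does not publish its timing model. Calculators like this one typically assume a fixed cruise speed plus an allowance for taxi, climb, and descent; the sketch below uses assumed values (500 mph cruise, 25 minutes of overhead) and lands close to the quoted estimate.

```python
def flight_time_minutes(distance_miles, cruise_mph=500, overhead_min=25):
    """Crude airliner flight-time estimate: cruise leg plus fixed overhead.

    Both parameters are assumptions, not the site's published model.
    """
    return distance_miles / cruise_mph * 60 + overhead_min

minutes = flight_time_minutes(888)
print(f"{int(minutes // 60)} h {round(minutes % 60)} min")  # 2 h 12 min
```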

What is the time difference between Barcaldine and Bamaga?

There is no time difference between Barcaldine and Bamaga.

Flight carbon footprint between Barcaldine Airport (BCI) and Northern Peninsula Airport (ABM)

On average, flying from Barcaldine to Bamaga generates about 143 kg (314 lb) of CO2 per passenger. Both figures are rounded estimates and include only the CO2 generated by burning jet fuel.
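
The emission factor behind the estimate is not published either. A common back-of-the-envelope model multiplies the flight distance by a short-haul per-passenger factor; with an assumed factor of 0.10 kg of CO2 per passenger-kilometer, the quoted figure falls out:

```python
KG_CO2_PER_PAX_KM = 0.10  # assumed short-haul factor, not the site's published value
LB_PER_KG = 2.20462

co2_kg = 1428 * KG_CO2_PER_PAX_KM  # flight distance in km from above
print(round(co2_kg), round(co2_kg * LB_PER_KG))  # 143 kg, 315 lb (within rounding of the quoted figures)
```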

Map of flight path and driving directions from Barcaldine to Bamaga

See the map of the shortest flight path between Barcaldine Airport (BCI) and Northern Peninsula Airport (ABM).

Airport information

Origin: Barcaldine Airport
City: Barcaldine
Country: Australia
IATA Code: BCI
ICAO Code: YBAR
Coordinates: 23°33′55″S, 145°18′25″E

Destination: Northern Peninsula Airport
City: Bamaga
Country: Australia
IATA Code: ABM
ICAO Code: YBAM
Coordinates: 10°57′2″S, 142°27′32″E
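
The coordinates above are given in degrees, minutes, and seconds, while the code examples earlier use decimal degrees. A small converter, assuming the exact format shown above (the regular expression is illustrative):

```python
import re

def dms_to_decimal(dms: str) -> float:
    """Convert e.g. '23°33′55″S' to signed decimal degrees."""
    deg, minutes, seconds, hemi = re.match(
        r"(\d+)°(\d+)′(\d+)″([NSEW])", dms).groups()
    value = int(deg) + int(minutes) / 60 + int(seconds) / 3600
    return -value if hemi in "SW" else value

print(dms_to_decimal("23°33′55″S"))   # -23.565277...
print(dms_to_decimal("142°27′32″E"))  # 142.458888...
```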