
How far is Bamaga from Kalgoorlie?

The distance between Kalgoorlie (Kalgoorlie-Boulder Airport) and Bamaga (Northern Peninsula Airport) is 1918 miles / 3086 kilometers / 1666 nautical miles.

The driving distance from Kalgoorlie (KGI) to Bamaga (ABM) is 3322 miles / 5347 kilometers, and travel time by car is about 76 hours 34 minutes.

Kalgoorlie-Boulder Airport – Northern Peninsula Airport

1918 miles / 3086 kilometers / 1666 nautical miles


Distance from Kalgoorlie to Bamaga

There are several ways to calculate the distance from Kalgoorlie to Bamaga. Here are two standard methods:

Vincenty's formula (applied above)
  • 1917.548 miles
  • 3085.994 kilometers
  • 1666.303 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
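For reference, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The coordinates in the example are the decimal-degree equivalents of the airport coordinates listed in the table below; this is an illustrative implementation, not the calculator's own code.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Ellipsoidal distance in metres between two lat/lon points (WGS-84)."""
    a = 6378137.0           # semi-major axis (m)
    f = 1 / 298.257223563   # flattening
    b = (1 - f) * a         # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate until the longitude difference converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.sqrt((cosU2 * sin_lam) ** 2 +
                              (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos_sq_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos_sq_alpha
                        if cos_sq_alpha else 0.0)
        C = f / 16 * cos_sq_alpha * (4 + f * (4 - 3 * cos_sq_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u_sq = cos_sq_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
    B = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
            B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) *
            (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# KGI → ABM, decimal-degree equivalents of the coordinates in the airport table below
print(vincenty_distance(-30.7892, 121.4619, -10.9506, 142.4589) / 1000)  # ≈ 3086 km
```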

Haversine formula
  • 1920.219 miles
  • 3090.292 kilometers
  • 1668.624 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
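A corresponding haversine sketch in Python, using a mean Earth radius of 6,371 km, reproduces the spherical figure quoted above:

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance (km) between two lat/lon points on a sphere."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2 +
         math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# KGI → ABM, same coordinates as above
print(haversine_distance(-30.7892, 121.4619, -10.9506, 142.4589))  # ≈ 3090 km
```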

How long does it take to fly from Kalgoorlie to Bamaga?

The estimated flight time from Kalgoorlie-Boulder Airport to Northern Peninsula Airport is 4 hours and 7 minutes.
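The page does not state how this estimate is derived. A rough sketch, assuming an average block speed of about 500 mph plus a fixed 30-minute allowance for taxi, climb and descent (both figures are assumptions, not the calculator's published parameters), lands in the same ballpark:

```python
def estimate_flight_time(distance_miles, cruise_mph=500, overhead_minutes=30):
    """Rough flight-time estimate: cruise time plus a fixed allowance.
    cruise_mph and overhead_minutes are assumed values for illustration."""
    total_minutes = distance_miles / cruise_mph * 60 + overhead_minutes
    hours, minutes = divmod(round(total_minutes), 60)
    return f"{hours} hours {minutes} minutes"

print(estimate_flight_time(1918))  # ≈ 4 hours 20 minutes; the page quotes 4 hours 7 minutes
```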

Flight carbon footprint between Kalgoorlie-Boulder Airport (KGI) and Northern Peninsula Airport (ABM)

On average, flying from Kalgoorlie to Bamaga generates about 210 kg of CO2 per passenger, which is roughly 463 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
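For illustration, the kilogram-to-pound conversion and the per-kilometre emission factor implied by the page's own numbers (210 kg over 3,086 km) work out as follows; the factor is derived from those two figures, not published by the site:

```python
KG_PER_POUND = 0.45359237

co2_kg = 210                      # per-passenger estimate quoted above
co2_lbs = co2_kg / KG_PER_POUND   # ≈ 463 lbs
implied_factor = co2_kg / 3086    # ≈ 0.068 kg CO2 per passenger-km (derived, not published)
print(f"{co2_lbs:.0f} lbs, {implied_factor:.3f} kg CO2 per passenger-km")
```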

Map of flight path and driving directions from Kalgoorlie to Bamaga

See the map of the shortest flight path between Kalgoorlie-Boulder Airport (KGI) and Northern Peninsula Airport (ABM).

Airport information

Origin: Kalgoorlie-Boulder Airport
City: Kalgoorlie
Country: Australia
IATA Code: KGI
ICAO Code: YPKG
Coordinates: 30°47′21″S, 121°27′43″E
Destination: Northern Peninsula Airport
City: Bamaga
Country: Australia
IATA Code: ABM
ICAO Code: YBAM
Coordinates: 10°57′2″S, 142°27′32″E