
How far is Bamaga from Geraldton?

The distance between Geraldton (Geraldton Airport) and Bamaga (Northern Peninsula Airport) is 2174 miles / 3498 kilometers / 1889 nautical miles.

The driving distance from Geraldton (GET) to Bamaga (ABM) is 3877 miles / 6240 kilometers, and travel time by car is about 82 hours 17 minutes.

Geraldton Airport – Northern Peninsula Airport

2174 miles / 3498 kilometers / 1889 nautical miles


Distance from Geraldton to Bamaga

There are several ways to calculate the distance from Geraldton to Bamaga. Here are two standard methods:

Vincenty's formula (applied above)
  • 2173.529 miles
  • 3497.955 kilometers
  • 1888.745 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
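As a sketch of how such a figure can be computed, here is a minimal implementation of Vincenty's inverse formula on the WGS-84 ellipsoid. The coordinates are converted from the DMS values in the airport information section below; this is an illustration, not the calculator's actual code.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2):
    """Ellipsoidal distance (statute miles) via Vincenty's inverse formula."""
    a = 6378137.0            # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563    # WGS-84 flattening
    b = (1 - f) * a          # semi-minor axis

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate the longitude difference until it converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)  # equatorial line
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sigma_m ** 2)))
    meters = b * A * (sigma - delta_sigma)
    return meters / 1609.344  # metres to statute miles
```

Called with the GET and ABM coordinates in decimal degrees, `vincenty_miles(-28.795833, 114.706944, -10.950556, 142.458889)` should agree with the 2173.529-mile figure above to well within a mile.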

Haversine formula
  • 2174.330 miles
  • 3499.245 kilometers
  • 1889.441 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
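The haversine calculation is much simpler. A minimal version, assuming a mean Earth radius of 6371 km (radius choices vary slightly between sites):

```python
import math

EARTH_RADIUS_KM = 6371.0  # assumed mean Earth radius

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers between two lat/lon points."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))
```

Dividing the result by 1.609344 gives statute miles and by 1.852 gives nautical miles, which reproduces the three haversine figures listed above to within rounding.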

How long does it take to fly from Geraldton to Bamaga?

The estimated flight time from Geraldton Airport to Northern Peninsula Airport is 4 hours and 36 minutes.
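Flight-time estimates like this are typically the great-circle distance divided by an assumed average speed, sometimes plus a fixed taxi/climb allowance. The sketch below uses a 500 mph default purely as an assumption; it is not the calculator's actual model.

```python
def flight_time(distance_miles, avg_speed_mph=500.0):
    """Rough flight-time estimate as (hours, minutes).

    avg_speed_mph is an assumed average block speed; real estimates
    also account for taxi, climb, and descent phases.
    """
    hours = distance_miles / avg_speed_mph
    h = int(hours)
    m = round((hours - h) * 60)
    return h, m
```

With the 2173.529-mile Vincenty distance and the 500 mph assumption this returns 4 hours 21 minutes; the site's 4 hours 36 minutes implies a somewhat lower effective average speed.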

Flight carbon footprint between Geraldton Airport (GET) and Northern Peninsula Airport (ABM)

On average, flying from Geraldton to Bamaga generates about 237 kg of CO2 per passenger, which is equivalent to 523 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
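The kilogram-to-pound conversion uses the exact definition 1 lb = 0.45359237 kg. Note that 237 kg converts to about 522.5 lb, so the published 523 lb figure was likely converted from the unrounded kilogram value.

```python
KG_PER_LB = 0.45359237  # exact definition of the avoirdupois pound

def kg_to_lb(kg):
    """Convert kilograms to pounds."""
    return kg / KG_PER_LB
```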

Map of flight path and driving directions from Geraldton to Bamaga

See the map of the shortest flight path between Geraldton Airport (GET) and Northern Peninsula Airport (ABM).

Airport information

Origin: Geraldton Airport
City: Geraldton
Country: Australia
IATA Code: GET
ICAO Code: YGEL
Coordinates: 28°47′45″S, 114°42′25″E
Destination: Northern Peninsula Airport
City: Bamaga
Country: Australia
IATA Code: ABM
ICAO Code: YBAM
Coordinates: 10°57′2″S, 142°27′32″E
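The coordinates above are given in degrees-minutes-seconds; the distance formulas need decimal degrees, with south and west taken as negative. A small helper for the conversion:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert a DMS coordinate to signed decimal degrees.

    hemisphere is one of 'N', 'S', 'E', 'W'; 'S' and 'W' give
    negative values.
    """
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value
```

For example, Geraldton's latitude 28°47′45″S becomes about -28.795833 and Bamaga's longitude 142°27′32″E becomes about 142.458889.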