
How far is Bamaga from Port Augusta?

The distance between Port Augusta (Port Augusta Airport) and Bamaga (Northern Peninsula Airport) is 1514 miles / 2436 kilometers / 1315 nautical miles.

The driving distance from Port Augusta (PUG) to Bamaga (ABM) is 2441 miles / 3929 kilometers, and travel time by car is about 55 hours 56 minutes.

Port Augusta Airport – Northern Peninsula Airport

Distance: 1514 miles / 2436 kilometers / 1315 nautical miles
Flight time: 3 h 21 min
CO2 emission: 180 kg


Distance from Port Augusta to Bamaga

There are several ways to calculate the distance from Port Augusta to Bamaga. Here are two standard methods:

Vincenty's formula (applied above)
  • 1513.701 miles
  • 2436.066 kilometers
  • 1315.370 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
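One way to reproduce the ellipsoidal figure is with the third-party geopy library. This is a sketch under assumptions: geopy is not necessarily what this page uses, and its geodesic() routine implements Karney's algorithm on the WGS-84 ellipsoid rather than Vincenty's iteration, though the two agree to well under a meter on a route like this. The decimal coordinates are converted from the DMS values listed under "Airport information" below.

from geopy.distance import geodesic  # pip install geopy

# Airport coordinates in decimal degrees, converted from the DMS
# values listed below (south and west are negative).
pug = (-32.5067, 137.7169)  # Port Augusta Airport (PUG)
abm = (-10.9506, 142.4589)  # Northern Peninsula Airport (ABM)

d = geodesic(pug, abm)  # ellipsoidal (WGS-84) distance
print(f"{d.miles:.3f} miles / {d.km:.3f} km / {d.nautical:.3f} NM")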

Haversine formula
  • 1519.629 miles
  • 2445.606 kilometers
  • 1320.522 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
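The haversine figure above can be reproduced with nothing but the standard library. A minimal sketch, assuming a mean Earth radius of 6371 km (the exact constant this page uses isn't stated, but this value matches the figures above to within rounding):

import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two points given in decimal
    degrees, assuming a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

km = haversine_km(-32.5067, 137.7169, -10.9506, 142.4589)
print(f"{km / 1.609344:.3f} miles / {km:.3f} km / {km / 1.852:.3f} NM")
# -> roughly 1519.6 miles / 2445.6 km / 1320.5 NM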

How long does it take to fly from Port Augusta to Bamaga?

The estimated flight time from Port Augusta Airport to Northern Peninsula Airport is 3 hours and 21 minutes.
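Flight-time estimates like this one are typically distance divided by an average cruise speed, plus a fixed allowance for taxi, climb, and descent. The parameters below are illustrative assumptions (this page doesn't publish its own), chosen so the result lands within a minute of the figure above:

CRUISE_KMH = 850    # assumed average cruise speed
ALLOWANCE_H = 0.5   # assumed taxi/climb/descent allowance, in hours

def flight_time(distance_km, cruise_kmh=CRUISE_KMH, allowance_h=ALLOWANCE_H):
    hours = allowance_h + distance_km / cruise_kmh
    return int(hours), round(hours % 1 * 60)

h, m = flight_time(2436.066)
print(f"{h} h {m} min")  # -> 3 h 22 min with these assumptions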

Flight carbon footprint between Port Augusta Airport (PUG) and Northern Peninsula Airport (ABM)

On average, flying from Port Augusta to Bamaga generates about 180 kg of CO2 per passenger; 180 kilograms equals roughly 397 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
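The arithmetic behind these numbers is simple to sketch. The per-kilometer emission factor below is an assumption back-solved from the figures above (roughly 74 g of CO2 per passenger-kilometer); real calculators vary it by aircraft type, seating density, and load factor:

KG_PER_LB = 0.45359237        # exact definition of the pound

co2_kg = 2436.066 * 0.0739    # assumed factor: ~74 g CO2 per pax-km
print(f"{co2_kg:.0f} kg = {co2_kg / KG_PER_LB:.0f} lbs")  # -> 180 kg = 397 lbs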

Map of flight path and driving directions from Port Augusta to Bamaga

Airport information

Origin: Port Augusta Airport
City: Port Augusta
Country: Australia
IATA Code: PUG
ICAO Code: YPAG
Coordinates: 32°30′24″S, 137°43′1″E
Destination: Northern Peninsula Airport
City: Bamaga
Country: Australia
IATA Code: ABM
ICAO Code: YBAM
Coordinates: 10°57′2″S, 142°27′32″E
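The coordinates above are given in degrees, minutes, and seconds. Here is a small helper for converting them to the decimal degrees used in the earlier snippets, written for this page's exact notation (e.g. "32°30′24″S"); other DMS styles would need a different pattern:

import re

def dms_to_decimal(dms: str) -> float:
    """Convert a DMS string such as '32°30′24″S' to decimal degrees."""
    deg, minutes, secs, hemi = re.fullmatch(r"(\d+)°(\d+)′(\d+)″([NSEW])", dms).groups()
    value = int(deg) + int(minutes) / 60 + int(secs) / 3600
    return -value if hemi in "SW" else value

print(dms_to_decimal("32°30′24″S"), dms_to_decimal("142°27′32″E"))
# -> -32.506666..., 142.458888...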