
How far is Bamaga from Wagga Wagga?

The distance between Wagga Wagga (Wagga Wagga Airport) and Bamaga (Northern Peninsula Airport) is 1696 miles / 2729 kilometers / 1474 nautical miles.

The driving distance from Wagga Wagga (WGA) to Bamaga (ABM) is 2167 miles / 3487 kilometers, and travel time by car is about 51 hours 31 minutes.

Wagga Wagga Airport – Northern Peninsula Airport

  • 1696 miles
  • 2729 kilometers
  • 1474 nautical miles


Distance from Wagga Wagga to Bamaga

There are several ways to calculate the distance from Wagga Wagga to Bamaga. Here are two standard methods:

Vincenty's formula (applied above)
  • 1695.991 miles
  • 2729.432 kilometers
  • 1473.776 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
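As a sketch of how such a figure can be reproduced, here is a standard pure-Python implementation of Vincenty's inverse formula on the WGS-84 ellipsoid (the site's exact ellipsoid parameters are not stated, so WGS-84 is an assumption), using the airport coordinates listed in the "Airport information" section:

```python
import math

# WGS-84 ellipsoid parameters (assumed; the page does not name its ellipsoid)
A_AXIS = 6378137.0             # semi-major axis in metres
F = 1 / 298.257223563          # flattening
B_AXIS = A_AXIS * (1 - F)      # semi-minor axis

def vincenty_km(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Ellipsoidal (Vincenty inverse) distance in kilometres."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)
    # Reduced latitudes on the auxiliary sphere
    U1 = math.atan((1 - F) * math.tan(phi1))
    U2 = math.atan((1 - F) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = F / 16 * cos2_alpha * (4 + F * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * F * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (A_AXIS ** 2 - B_AXIS ** 2) / B_AXIS ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sigma_m ** 2)))
    return B_AXIS * A * (sigma - delta_sigma) / 1000.0

# WGA and ABM in decimal degrees (south latitudes are negative)
print(round(vincenty_km(-35.165278, 147.465833,
                        -10.950556, 142.458889), 3))  # ≈ 2729.4 km
```

The result agrees with the 2729.432 km figure above to within rounding of the input coordinates.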

Haversine formula
  • 1702.382 miles
  • 2739.719 kilometers
  • 1479.330 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
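The haversine calculation is compact enough to show in full. This sketch assumes a mean Earth radius of 6371 km (the page does not state which radius it uses, so the last decimal places may differ):

```python
import math

EARTH_RADIUS_KM = 6371.0  # assumed mean radius; the site's constant may differ

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two lat/lon points."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = phi2 - phi1
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

# WGA -> ABM, coordinates from the "Airport information" section
print(round(haversine_km(-35.165278, 147.465833,
                         -10.950556, 142.458889), 1))  # within a few km of 2739.7
```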

How long does it take to fly from Wagga Wagga to Bamaga?

The estimated flight time from Wagga Wagga Airport to Northern Peninsula Airport is 3 hours and 42 minutes.
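Estimates like this are typically derived from the flight distance divided by an average cruise speed, plus a fixed allowance for taxi, climb and descent. The 500 mph speed and 30-minute buffer below are illustrative assumptions, not the site's actual parameters, which is why the result differs slightly from the figure quoted above:

```python
# Rough flight-time estimate: distance / cruise speed + fixed buffer.
# Both parameters are assumptions for illustration only.
DISTANCE_MILES = 1696
CRUISE_MPH = 500       # assumed average speed including climb/descent
BUFFER_MIN = 30        # assumed taxi and manoeuvring allowance

total_min = DISTANCE_MILES / CRUISE_MPH * 60 + BUFFER_MIN
hours, minutes = divmod(round(total_min), 60)
print(f"{hours} h {minutes} min")  # → 3 h 54 min with these assumptions
```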

What is the time difference between Wagga Wagga and Bamaga?

There is no time difference between Wagga Wagga and Bamaga.

Flight carbon footprint between Wagga Wagga Airport (WGA) and Northern Peninsula Airport (ABM)

On average, flying from Wagga Wagga to Bamaga generates about 192 kg of CO2 per passenger, which is roughly 424 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.

Map of flight path and driving directions from Wagga Wagga to Bamaga

See the map of the shortest flight path between Wagga Wagga Airport (WGA) and Northern Peninsula Airport (ABM).

Airport information

Origin Wagga Wagga Airport
City: Wagga Wagga
Country: Australia
IATA Code: WGA
ICAO Code: YSWG
Coordinates: 35°9′55″S, 147°27′57″E
Destination Northern Peninsula Airport
City: Bamaga
Country: Australia
IATA Code: ABM
ICAO Code: YBAM
Coordinates: 10°57′2″S, 142°27′32″E
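The DMS coordinates above can be converted into the signed decimal degrees used by the distance formulas (southern latitudes and western longitudes are negative):

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere letter to decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

wga_lat = dms_to_decimal(35, 9, 55, "S")     # 35°9′55″S
wga_lon = dms_to_decimal(147, 27, 57, "E")   # 147°27′57″E
print(round(wga_lat, 5), round(wga_lon, 5))  # → -35.16528 147.46583
```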