
How far is Bamaga from Hervey Bay?

The distance between Hervey Bay (Hervey Bay Airport) and Bamaga (Northern Peninsula Airport) is 1201 miles / 1933 kilometers / 1044 nautical miles.

The driving distance from Hervey Bay (HVB) to Bamaga (ABM) is 1515 miles / 2438 kilometers, and travel time by car is about 37 hours 20 minutes.

Hervey Bay Airport – Northern Peninsula Airport: 1201 miles / 1933 kilometers / 1044 nautical miles


Distance from Hervey Bay to Bamaga

There are several ways to calculate the distance from Hervey Bay to Bamaga. Here are two standard methods:

Vincenty's formula (applied above)
  • 1201.135 miles
  • 1933.040 kilometers
  • 1043.758 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
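Below is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid. It is an illustration, not this site's exact implementation; the decimal-degree coordinates in the usage line are converted from the DMS airport coordinates listed further down.

```python
import math

def vincenty_inverse(lat1, lon1, lat2, lon2):
    # WGS-84 ellipsoid parameters
    a = 6378137.0            # semi-major axis in metres
    f = 1 / 298.257223563    # flattening
    b = (1 - f) * a          # semi-minor axis

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    # Iterate on the longitude difference on the auxiliary sphere.
    # Note: nearly antipodal points may fail to converge; fine for a sketch.
    lam = L
    for _ in range(200):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = 0 if cos2_alpha == 0 else \
            cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
                * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)  # geodesic distance in metres

# Hervey Bay (HVB) to Bamaga (ABM)
print(vincenty_inverse(-25.3189, 152.8800, -10.9506, 142.4589) / 1000)  # ≈ 1933 km
```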

Haversine formula
  • 1204.311 miles
  • 1938.150 kilometers
  • 1046.518 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
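A minimal sketch of the haversine calculation, assuming a mean Earth radius of 6371 km (the result shifts slightly with the radius chosen):

```python
import math

def haversine(lat1, lon1, lat2, lon2, r=6371.0):
    # r: assumed mean Earth radius in kilometres
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))  # great-circle distance in km

# Hervey Bay (HVB) to Bamaga (ABM)
print(haversine(-25.3189, 152.8800, -10.9506, 142.4589))  # ≈ 1938 km
```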

How long does it take to fly from Hervey Bay to Bamaga?

The estimated flight time from Hervey Bay Airport to Northern Peninsula Airport is 2 hours and 46 minutes.
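The page does not state how this estimate is derived. A common rule of thumb (an assumption here, not the site's published method) adds 30 minutes for taxi, climb, and descent to cruise time at roughly 500 mph, which lands close to the published figure:

```python
# Rough flight-time estimate; cruise speed is an assumed value.
distance_miles = 1201
cruise_mph = 500
hours = 0.5 + distance_miles / cruise_mph
print(f"{int(hours)} h {round(hours % 1 * 60)} min")  # 2 h 54 min vs the published 2 h 46 min
```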

What is the time difference between Hervey Bay and Bamaga?

There is no time difference between Hervey Bay and Bamaga: both are in Queensland, which observes Australian Eastern Standard Time (UTC+10) year-round.

Flight carbon footprint between Hervey Bay Airport (HVB) and Northern Peninsula Airport (ABM)

On average, flying from Hervey Bay to Bamaga generates about 161 kg of CO2 per passenger, which is about 356 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
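The kilogram-to-pound conversion is plain arithmetic; the published 356 lb presumably reflects converting before rounding the kilogram figure:

```python
co2_kg = 161
print(round(co2_kg * 2.20462))  # 355 lb from the rounded 161 kg
```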

Map of flight path and driving directions from Hervey Bay to Bamaga

See the map of the shortest flight path between Hervey Bay Airport (HVB) and Northern Peninsula Airport (ABM).

Airport information

Origin Hervey Bay Airport
City: Hervey Bay
Country: Australia
IATA Code: HVB
ICAO Code: YHBA
Coordinates: 25°19′8″S, 152°52′48″E
Destination Northern Peninsula Airport
City: Bamaga
Country: Australia
IATA Code: ABM
ICAO Code: YBAM
Coordinates: 10°57′2″S, 142°27′32″E
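For reference, a small helper (hypothetical, not part of this site) that converts the DMS coordinates above into the decimal degrees used in the distance sketches earlier:

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    # South and West hemispheres are negative in decimal degrees.
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

# Hervey Bay Airport (HVB): 25°19′8″S, 152°52′48″E
print(dms_to_decimal(25, 19, 8, "S"), dms_to_decimal(152, 52, 48, "E"))  # -25.3189 152.88
# Northern Peninsula Airport (ABM): 10°57′2″S, 142°27′32″E
print(dms_to_decimal(10, 57, 2, "S"), dms_to_decimal(142, 27, 32, "E"))  # -10.9506 142.4589
```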