How far is Hervey Bay from Bamaga?

The distance between Bamaga (Northern Peninsula Airport) and Hervey Bay (Hervey Bay Airport) is 1201 miles / 1933 kilometers / 1044 nautical miles.

The driving distance from Bamaga (ABM) to Hervey Bay (HVB) is 1516 miles / 2439 kilometers, and travel time by car is about 37 hours 26 minutes.

Northern Peninsula Airport – Hervey Bay Airport

  • 1201 miles
  • 1933 kilometers
  • 1044 nautical miles

Distance from Bamaga to Hervey Bay

There are several ways to calculate the distance from Bamaga to Hervey Bay. Here are two standard methods:

Vincenty's formula (applied above)
  • 1201.135 miles
  • 1933.040 kilometers
  • 1043.758 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
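For readers who want to reproduce the figure, below is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The function name, tolerance, and iteration cap are our own choices for illustration, not parameters published by this site:

    import math

    def vincenty_miles(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
        # WGS-84 ellipsoid
        a = 6378137.0             # semi-major axis in metres
        f = 1 / 298.257223563     # flattening
        b = (1 - f) * a           # semi-minor axis

        U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
        U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
        L = math.radians(lon2 - lon1)
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)

        lam = L
        for _ in range(max_iter):  # iterate the longitude on the auxiliary sphere until it converges
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.hypot(cosU2 * sin_lam,
                                   cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
            if sin_sigma == 0:
                return 0.0  # coincident points
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos2_alpha = 1.0 - sin_alpha ** 2
            cos_2sm = (cos_sigma - 2.0 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
            C = f / 16.0 * cos2_alpha * (4.0 + f * (4.0 - 3.0 * cos2_alpha))
            lam_prev = lam
            lam = L + (1.0 - C) * f * sin_alpha * (sigma + C * sin_sigma *
                  (cos_2sm + C * cos_sigma * (-1.0 + 2.0 * cos_2sm ** 2)))
            if abs(lam - lam_prev) < tol:
                break

        u2 = cos2_alpha * (a * a - b * b) / (b * b)
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (cos_sigma * (-1 + 2 * cos_2sm ** 2) -
                  B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
        metres = b * A * (sigma - d_sigma)
        return metres / 1609.344  # international mile

    # ABM -> HVB using decimal degrees derived from the airport table below
    print(vincenty_miles(-10.9506, 142.4589, -25.3189, 152.8800))  # ~1201 miles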

Haversine formula
  • 1204.311 miles
  • 1938.150 kilometers
  • 1046.518 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth, giving the great-circle distance: the shortest path between two points along the surface of the sphere.
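A haversine implementation is much shorter. This sketch (the helper name is our own) uses a mean earth radius of 6,371 km, which reproduces the kilometre figure above to within rounding:

    import math

    def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
        # great-circle distance on a sphere of the given mean radius
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        a = (math.sin(dphi / 2) ** 2 +
             math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
        return 2 * radius_km * math.asin(math.sqrt(a))

    print(haversine_km(-10.9506, 142.4589, -25.3189, 152.8800))  # ~1938 km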

How long does it take to fly from Bamaga to Hervey Bay?

The estimated flight time from Northern Peninsula Airport to Hervey Bay Airport is 2 hours and 46 minutes.
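The site does not publish its flight-time model, but estimates like this are typically derived from the air distance using a fixed allowance for taxi, climb, and descent plus an average cruise speed. A hypothetical sketch (the 30-minute allowance and 500 mph cruise speed are illustrative assumptions, not the site's actual parameters, so it lands a few minutes off the figure quoted here):

    def flight_time_minutes(distance_miles, cruise_mph=500.0, overhead_min=30.0):
        # assumed model: fixed taxi/climb/descent allowance plus constant cruise speed
        return overhead_min + distance_miles / cruise_mph * 60.0

    minutes = flight_time_minutes(1201)
    print(f"{int(minutes // 60)} h {int(minutes % 60)} min")  # ~2 h 54 min under these assumptions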

What is the time difference between Bamaga and Hervey Bay?

There is no time difference between Bamaga and Hervey Bay: both are in Queensland, which observes Australian Eastern Standard Time (UTC+10) year-round.
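A quick check with Python's standard zoneinfo module confirms the shared UTC+10 offset in both summer and winter:

    from datetime import datetime
    from zoneinfo import ZoneInfo

    # both Bamaga and Hervey Bay fall under the Queensland time zone
    qld = ZoneInfo("Australia/Brisbane")
    print(datetime(2024, 1, 15, tzinfo=qld).utcoffset())  # 10:00:00
    print(datetime(2024, 7, 15, tzinfo=qld).utcoffset())  # 10:00:00 (no daylight saving)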

Flight carbon footprint between Northern Peninsula Airport (ABM) and Hervey Bay Airport (HVB)

On average, flying from Bamaga to Hervey Bay generates about 161 kg of CO2 per passenger, which is roughly 356 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
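The kilograms-to-pounds conversion is easy to verify; the rounded 161 kg figure converts to about 355 lbs, so the 356 lbs shown here suggests the site converts from an unrounded kilogram estimate before rounding:

    KG_TO_LB = 2.20462

    print(161 * KG_TO_LB)    # ~354.9 lb from the rounded figure
    print(161.5 * KG_TO_LB)  # ~356.0 lb, consistent with an unrounded estimate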

Map of flight path and driving directions from Bamaga to Hervey Bay

See the map of the shortest flight path between Northern Peninsula Airport (ABM) and Hervey Bay Airport (HVB).

Airport information

Origin: Northern Peninsula Airport
City: Bamaga
Country: Australia
IATA Code: ABM
ICAO Code: YBAM
Coordinates: 10°57′2″S, 142°27′32″E

Destination: Hervey Bay Airport
City: Hervey Bay
Country: Australia
IATA Code: HVB
ICAO Code: YHBA
Coordinates: 25°19′8″S, 152°52′48″E
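The coordinates above are given in degrees, minutes, and seconds; the decimal-degree values used in the code sketches earlier can be derived with a small helper (our own, for illustration):

    def dms_to_decimal(degrees, minutes, seconds, hemisphere):
        # southern and western hemispheres map to negative decimal degrees
        sign = -1.0 if hemisphere in ("S", "W") else 1.0
        return sign * (degrees + minutes / 60.0 + seconds / 3600.0)

    print(dms_to_decimal(10, 57, 2, "S"), dms_to_decimal(142, 27, 32, "E"))  # ABM: -10.9506, 142.4589
    print(dms_to_decimal(25, 19, 8, "S"), dms_to_decimal(152, 52, 48, "E"))  # HVB: -25.3189, 152.8800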