How far is Bamaga from Lord Howe Island?

The distance between Lord Howe Island (Lord Howe Island Airport) and Bamaga (Northern Peninsula Airport) is 1771 miles / 2850 kilometers / 1539 nautical miles.

There is no driving route from Lord Howe Island (LDH) to Bamaga (ABM): Lord Howe Island lies roughly 600 km off the New South Wales coast in the Tasman Sea and has no road connection to the Australian mainland, so the trip can only be made by air.

Lord Howe Island Airport – Northern Peninsula Airport

1771 Miles
2850 Kilometers
1539 Nautical miles

Distance from Lord Howe Island to Bamaga

There are several ways to calculate the distance from Lord Howe Island to Bamaga. Here are two standard methods:

Vincenty's formula (applied above)
  • 1770.928 miles
  • 2850.032 kilometers
  • 1538.894 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
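An ellipsoidal result comparable to Vincenty's can be reproduced with the geopy library, whose geodesic distance uses Karney's algorithm on the WGS-84 ellipsoid; for a route like this it agrees with Vincenty to well under a meter. This is a minimal sketch, with the decimal coordinates converted from the DMS values in the airport information section below:

```python
# Ellipsoidal (WGS-84) distance, comparable to Vincenty's formula.
# Requires: pip install geopy
from geopy.distance import geodesic

# Decimal-degree coordinates from the airport information section
# (south latitudes are negative).
LDH = (-31.538056, 159.076944)   # Lord Howe Island Airport
ABM = (-10.950556, 142.458889)   # Northern Peninsula Airport

d = geodesic(LDH, ABM)
print(f"{d.miles:.3f} miles")              # ~1770.9 miles
print(f"{d.kilometers:.3f} kilometers")    # ~2850.0 kilometers
print(f"{d.nautical:.3f} nautical miles")  # ~1538.9 nautical miles
```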

Haversine formula
  • 1774.700 miles
  • 2856.102 kilometers
  • 1542.172 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
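For reference, here is a minimal Python sketch of the haversine formula, assuming a mean earth radius of 6,371 km; plugging in the airport coordinates listed below reproduces the figures above to within rounding:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere via the haversine formula."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# LDH -> ABM, decimal degrees (south latitudes negative)
km = haversine_km(-31.538056, 159.076944, -10.950556, 142.458889)
print(f"{km:.3f} km")             # ~2856.1 kilometers
print(f"{km * 0.621371:.3f} mi")  # ~1774.7 miles
print(f"{km / 1.852:.3f} nmi")    # ~1542.2 nautical miles
```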

How long does it take to fly from Lord Howe Island to Bamaga?

The estimated flight time from Lord Howe Island Airport to Northern Peninsula Airport is 3 hours and 51 minutes.
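The page does not state how this estimate is derived. A common approach, reconstructed here as an assumption, is to divide the distance by an average block speed; a speed of about 460 mph is back-solved from the figures above, not a published methodology:

```python
# Hypothetical flight-time estimate: distance / assumed average speed.
# The ~460 mph block speed is an assumption implied by the page's own
# numbers (1770.9 miles in 3 hours 51 minutes).
distance_miles = 1770.928
avg_speed_mph = 460.0  # assumed average block speed

hours = distance_miles / avg_speed_mph
h, m = int(hours), round((hours - int(hours)) * 60)
print(f"Estimated flight time: {h} hours {m} minutes")  # 3 hours 51 minutes
```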

Flight carbon footprint between Lord Howe Island Airport (LDH) and Northern Peninsula Airport (ABM)

On average, flying from Lord Howe Island to Bamaga generates about 198 kg (436 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
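The per-passenger figure implies an emission factor of roughly 0.112 kg of CO2 per passenger-mile (198 kg ÷ 1,771 mi). A sketch of that back-of-the-envelope calculation, with the factor treated as an assumption derived from the page's own numbers rather than an official value:

```python
# Back-of-the-envelope CO2 estimate. The emission factor is implied by
# the page's own figures (198 kg over 1770.9 miles), not an official value.
KG_PER_LB = 0.45359237

distance_miles = 1770.928
co2_kg_per_passenger_mile = 0.112  # assumed factor implied by the page

co2_kg = distance_miles * co2_kg_per_passenger_mile
print(f"{co2_kg:.0f} kg CO2 per passenger")               # ~198 kg
print(f"{co2_kg / KG_PER_LB:.0f} lb CO2 per passenger")   # ~437 lb
```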

Map of flight path from Lord Howe Island to Bamaga

See the map of the shortest flight path between Lord Howe Island Airport (LDH) and Northern Peninsula Airport (ABM).

Airport information

Origin: Lord Howe Island Airport
City: Lord Howe Island
Country: Australia
IATA Code: LDH
ICAO Code: YLHI
Coordinates: 31°32′17″S, 159°4′37″E

Destination: Northern Peninsula Airport
City: Bamaga
Country: Australia
IATA Code: ABM
ICAO Code: YBAM
Coordinates: 10°57′2″S, 142°27′32″E