
How far is Palm Island from Bamaga?

The distance between Bamaga (Northern Peninsula Airport) and Palm Island (Palm Island Airport) is 603 miles / 971 kilometers / 524 nautical miles.

The driving distance from Bamaga (ABM) to Palm Island (PMK) is 788 miles / 1268 kilometers, and travel time by car is about 32 hours 3 minutes.

Northern Peninsula Airport – Palm Island Airport

603 miles / 971 kilometers / 524 nautical miles


Distance from Bamaga to Palm Island

There are several ways to calculate the distance from Bamaga to Palm Island. Here are two standard methods:

Vincenty's formula (applied above)
  • 603.134 miles
  • 970.650 kilometers
  • 524.109 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
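The page doesn't say which software it uses; as a rough cross-check, the ellipsoidal distance can be reproduced in Python with the pyproj package, whose Geod solver works on the WGS84 ellipsoid (it uses Karney's geodesic algorithm rather than Vincenty's iteration, but the two agree to well under a metre over a distance like this). The coordinates are taken from the airport table further down the page.

```python
# Ellipsoidal (WGS84) distance between ABM and PMK.
# pyproj's Geod solves the geodesic problem on the ellipsoid; this is not
# Vincenty's exact iteration, but the results match it very closely.
from pyproj import Geod

# Airport coordinates from the table below (decimal degrees; south/east).
ABM = (-10.9506, 142.4589)   # Northern Peninsula Airport: 10°57′2″S, 142°27′32″E
PMK = (-18.7553, 146.5808)   # Palm Island Airport: 18°45′19″S, 146°34′51″E

geod = Geod(ellps="WGS84")
_, _, meters = geod.inv(ABM[1], ABM[0], PMK[1], PMK[0])  # note lon/lat order

print(f"{meters / 1609.344:.3f} miles")       # ≈ 603 miles
print(f"{meters / 1000:.3f} kilometers")      # ≈ 971 km
print(f"{meters / 1852:.3f} nautical miles")  # ≈ 524 NM
```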

Haversine formula
  • 605.325 miles
  • 974.175 kilometers
  • 526.013 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
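For comparison, here is a minimal haversine implementation in Python using the same airport coordinates; the mean earth radius of 6,371 km is a common convention rather than anything the page specifies.

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points on a spherical earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    c = 2 * math.asin(math.sqrt(a))          # central angle in radians
    return radius_km * c / 1.609344          # convert km to statute miles

# Bamaga (ABM) to Palm Island (PMK), coordinates from the airport table below.
print(haversine_miles(-10.9506, 142.4589, -18.7553, 146.5808))  # ≈ 605 miles
```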

How long does it take to fly from Bamaga to Palm Island?

The estimated flight time from Northern Peninsula Airport to Palm Island Airport is 1 hour and 38 minutes.
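The page doesn't publish the assumptions behind this estimate. A common rule of thumb for figures like this (not necessarily the one used here) is an average cruise speed of around 500 mph plus roughly half an hour for taxi, climb, and descent, which lands in the same ballpark, a few minutes above the 1 hour 38 minutes quoted:

```python
# Rough flight-time estimate using an assumed rule of thumb; the site's
# exact cruise-speed and taxi/climb allowances are not published.
distance_miles = 603
cruise_mph = 500          # assumed average cruise speed
overhead_minutes = 30     # assumed allowance for taxi, climb, and descent

minutes = distance_miles / cruise_mph * 60 + overhead_minutes
print(f"~{int(minutes // 60)} h {int(minutes % 60)} min")  # ~1 h 42 min
```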

What is the time difference between Bamaga and Palm Island?

There is no time difference between Bamaga and Palm Island; both are in Queensland, which observes Australian Eastern Standard Time (UTC+10) year-round.

Flight carbon footprint between Northern Peninsula Airport (ABM) and Palm Island Airport (PMK)

On average, flying from Bamaga to Palm Island generates about 113 kg of CO2 per passenger, which is equivalent to 249 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
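The pound figure follows from the standard conversion factor: 113 kg × 2.20462 lb/kg ≈ 249 lb.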

Map of flight path and driving directions from Bamaga to Palm Island

See the map of the shortest flight path between Northern Peninsula Airport (ABM) and Palm Island Airport (PMK).

Airport information

Origin: Northern Peninsula Airport
City: Bamaga
Country: Australia
IATA Code: ABM
ICAO Code: YBAM
Coordinates: 10°57′2″S, 142°27′32″E
Destination: Palm Island Airport
City: Palm Island
Country: Australia
IATA Code: PMK
ICAO Code: YPAM
Coordinates: 18°45′19″S, 146°34′51″E