
How far is Bamaga from Parkes?

The distance between Parkes (Parkes Airport) and Bamaga (Northern Peninsula Airport) is 1570 miles / 2526 kilometers / 1364 nautical miles.

The driving distance from Parkes (PKE) to Bamaga (ABM) is 1993 miles / 3207 kilometers, and travel time by car is about 47 hours 30 minutes.

Parkes Airport – Northern Peninsula Airport

  • 1570 miles
  • 2526 kilometers
  • 1364 nautical miles
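The three figures above are the same distance expressed in different units; a minimal sketch of the conversions, using the standard definitions (1 mile = 1.609344 km, 1 nautical mile = 1.852 km):

```python
# Convert the kilometer figure into statute and nautical miles.
KM_PER_MILE = 1.609344        # exact definition of the international mile
KM_PER_NAUTICAL_MILE = 1.852  # exact definition of the nautical mile

distance_km = 2526.405        # Parkes -> Bamaga (Vincenty figure below)
miles = distance_km / KM_PER_MILE
nautical_miles = distance_km / KM_PER_NAUTICAL_MILE

print(f"{miles:.0f} mi, {distance_km:.0f} km, {nautical_miles:.0f} NM")
# → 1570 mi, 2526 km, 1364 NM
```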


Distance from Parkes to Bamaga

There are several ways to calculate the distance from Parkes to Bamaga. Here are two standard methods:

Vincenty's formula (applied above)
  • 1569.836 miles
  • 2526.405 kilometers
  • 1364.150 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
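The inverse Vincenty calculation can be sketched as follows. This is a standard textbook implementation on the WGS-84 ellipsoid, not the site's own code, using the airport coordinates from the table below converted to decimal degrees:

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2):
    """Inverse Vincenty distance on the WGS-84 ellipsoid, in kilometers."""
    a = 6378137.0                # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563        # WGS-84 flattening
    b = (1 - f) * a              # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate until the auxiliary longitude converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0

# Parkes (PKE) to Bamaga (ABM), south latitudes negative
print(vincenty_km(-33.1314, 148.2389, -10.9506, 142.4589))  # ≈ 2526 km
```

Note that this simple sketch does not handle the nearly antipodal points where Vincenty's iteration can fail to converge; the Parkes–Bamaga pair converges in a few iterations.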

Haversine formula
  • 1575.786 miles
  • 2535.982 kilometers
  • 1369.321 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
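The haversine calculation is short enough to show in full; a sketch assuming the commonly used mean Earth radius of 6371 km (the site's exact radius is not published):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance on a sphere, in kilometers."""
    R = 6371.0  # mean Earth radius (km); assumed value
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

# Parkes (PKE) to Bamaga (ABM), south latitudes negative
print(haversine_km(-33.1314, 148.2389, -10.9506, 142.4589))  # ≈ 2536 km
```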

How long does it take to fly from Parkes to Bamaga?

The estimated flight time from Parkes Airport to Northern Peninsula Airport is 3 hours and 28 minutes.
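The site's exact model is not published, but estimates in this range follow from a common rule of thumb: a fixed overhead for taxi, climb and descent plus cruise time at a typical airliner speed. The parameters below (30 minutes overhead, 500 mph cruise) are illustrative assumptions:

```python
def flight_time_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough block-time estimate: fixed overhead plus time at cruise speed."""
    return overhead_min + distance_miles / cruise_mph * 60

minutes = flight_time_minutes(1570)  # Parkes -> Bamaga distance in miles
print(f"{int(minutes // 60)} h {int(minutes % 60)} min")  # → 3 h 38 min
```

Different overhead and cruise assumptions shift the answer by a few minutes either way, which is why this sketch lands near, but not exactly on, the 3 hours 28 minutes quoted above.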

What is the time difference between Parkes and Bamaga?

There is no time difference between Parkes and Bamaga.

Flight carbon footprint between Parkes Airport (PKE) and Northern Peninsula Airport (ABM)

On average, flying from Parkes to Bamaga generates about 184 kg of CO2 per passenger, which is about 406 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
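The emission estimate itself depends on the site's fuel-burn model, which is not published; only the kilogram-to-pound step is plain arithmetic (1 kg ≈ 2.20462 lb):

```python
LBS_PER_KG = 2.20462   # pounds per kilogram

co2_kg = 184           # estimated CO2 per passenger, from the figure above
co2_lbs = co2_kg * LBS_PER_KG
print(f"{co2_lbs:.2f}")  # → 405.65, i.e. about 406 lb
```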

Map of flight path and driving directions from Parkes to Bamaga

See the map of the shortest flight path between Parkes Airport (PKE) and Northern Peninsula Airport (ABM).

Airport information

Origin: Parkes Airport
City: Parkes
Country: Australia
IATA Code: PKE
ICAO Code: YPKS
Coordinates: 33°7′53″S, 148°14′20″E
Destination: Northern Peninsula Airport
City: Bamaga
Country: Australia
IATA Code: ABM
ICAO Code: YBAM
Coordinates: 10°57′2″S, 142°27′32″E