How far is Bamaga from Kununurra?

The distance between Kununurra (East Kimberley Regional Airport) and Bamaga (Northern Peninsula Airport) is 983 miles / 1582 kilometers / 854 nautical miles.

The driving distance from Kununurra (KNX) to Bamaga (ABM) is 2070 miles / 3332 kilometers, and travel time by car is about 48 hours 34 minutes.

East Kimberley Regional Airport – Northern Peninsula Airport

Distance: 983 miles / 1582 kilometers / 854 nautical miles

Distance from Kununurra to Bamaga

There are several ways to calculate the distance from Kununurra to Bamaga. Here are two standard methods:

Vincenty's formula (applied above)
  • 982.854 miles
  • 1581.751 kilometers
  • 854.077 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
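
For an ellipsoidal calculation of this kind, a library such as geopy can be used; its geodesic distance is computed on the WGS-84 ellipsoid (Karney's algorithm, a refinement of Vincenty's approach). A minimal sketch, with the airport coordinates converted to decimal degrees from the values listed further down:

# Requires: pip install geopy
from geopy.distance import geodesic

# Airport coordinates in decimal degrees (converted from the DMS values in the airport information section)
KNX = (-15.778056, 128.707778)  # East Kimberley Regional Airport
ABM = (-10.950556, 142.458889)  # Northern Peninsula Airport

d = geodesic(KNX, ABM)  # WGS-84 ellipsoid by default
print(f"{d.miles:.3f} miles / {d.km:.3f} km / {d.nautical:.3f} NM")
# Expect values close to the ~983 mi / ~1582 km / ~854 NM figures quoted above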

Haversine formula
  • 982.294 miles
  • 1580.849 kilometers
  • 853.590 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
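
A minimal, self-contained sketch of the haversine formula (assuming a mean Earth radius of 6,371 km) that reproduces the spherical figures above:

import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points on a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

# KNX and ABM in decimal degrees (from the coordinates listed below)
km = haversine_km(-15.778056, 128.707778, -10.950556, 142.458889)
print(f"{km:.1f} km / {km / 1.609344:.1f} miles / {km / 1.852:.1f} NM")
# Roughly 1581 km / 982 miles / 854 NM, matching the haversine figures above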

How long does it take to fly from Kununurra to Bamaga?

The estimated flight time from East Kimberley Regional Airport to Northern Peninsula Airport is 2 hours and 21 minutes.
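
The page does not state how this estimate is derived. A common back-of-the-envelope approach (an assumption here, not the calculator's published method) is to divide the great-circle distance by a typical cruise speed of about 500 mph and add roughly 30 minutes for taxi, climb, and descent, which lands in the same ballpark as the 2 hours 21 minutes quoted above:

def estimate_flight_time_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough flight-time estimate: cruise time plus a fixed taxi/climb/descent buffer.
    Cruise speed and overhead are illustrative assumptions."""
    return distance_miles / cruise_mph * 60 + overhead_min

minutes = estimate_flight_time_minutes(983)
print(f"{int(minutes // 60)} h {int(minutes % 60)} min")  # ~2 h 27 min with these assumptions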

Flight carbon footprint between East Kimberley Regional Airport (KNX) and Northern Peninsula Airport (ABM)

On average, flying from Kununurra to Bamaga generates about 150 kg of CO2 per passenger, or roughly 330 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
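
The per-passenger figure is consistent with an emissions factor of roughly 0.095 kg of CO2 per passenger-kilometre; that factor is an assumption used here only to illustrate the arithmetic, not the calculator's published methodology:

KG_PER_PAX_KM = 0.095   # assumed emissions factor (kg CO2 per passenger-km)
KG_TO_LBS = 2.20462

distance_km = 1582
co2_kg = distance_km * KG_PER_PAX_KM
print(f"{co2_kg:.0f} kg CO2 per passenger (~{co2_kg * KG_TO_LBS:.0f} lbs)")
# ~150 kg / ~331 lbs, in line with the figures above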

Map of flight path and driving directions from Kununurra to Bamaga

See the map of the shortest flight path between East Kimberley Regional Airport (KNX) and Northern Peninsula Airport (ABM).

Airport information

Origin: East Kimberley Regional Airport
City: Kununurra
Country: Australia
IATA Code: KNX
ICAO Code: YPKU
Coordinates: 15°46′41″S, 128°42′28″E
Destination: Northern Peninsula Airport
City: Bamaga
Country: Australia
IATA Code: ABM
ICAO Code: YBAM
Coordinates: 10°57′2″S, 142°27′32″E
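
The coordinates above are given in degrees, minutes, and seconds. A small helper (a sketch, with an illustrative function name) converts them to the decimal degrees used in the distance examples earlier:

def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere (N/S/E/W) to signed decimal degrees."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

# KNX: 15°46′41″S, 128°42′28″E  ->  (-15.778056, 128.707778)
# ABM: 10°57′2″S, 142°27′32″E   ->  (-10.950556, 142.458889)
knx = (dms_to_decimal(15, 46, 41, "S"), dms_to_decimal(128, 42, 28, "E"))
abm = (dms_to_decimal(10, 57, 2, "S"), dms_to_decimal(142, 27, 32, "E"))
print(knx, abm)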