
How far is Bamaga from Kubin Island?

The distance between Kubin Island (Kubin Airport) and Bamaga (Northern Peninsula Airport) is 53 miles / 85 kilometers / 46 nautical miles.

The driving distance from Kubin Island (KUG) to Bamaga (ABM) is 25 miles / 41 kilometers, and travel time by car is about 59 minutes.

Kubin Airport – Northern Peninsula Airport

53 miles / 85 kilometers / 46 nautical miles

Distance from Kubin Island to Bamaga

There are several ways to calculate the distance from Kubin Island to Bamaga. Here are two standard methods:

Vincenty's formula (applied above)
  • 52.508 miles
  • 84.503 kilometers
  • 45.628 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
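
For readers who want to reproduce the figure, here is a minimal sketch assuming the Python geopy package; its geodesic() function uses Karney's algorithm on the WGS-84 ellipsoid, which agrees with Vincenty's formula to well under a metre at this range.

    from geopy.distance import geodesic

    # Airport coordinates converted to signed decimal degrees
    kubin = (-10.225000, 142.217778)    # KUG: 10°13′30″S, 142°13′4″E
    bamaga = (-10.950556, 142.458889)   # ABM: 10°57′2″S, 142°27′32″E

    d = geodesic(kubin, bamaga)
    print(f"{d.miles:.3f} miles / {d.km:.3f} km / {d.nm:.3f} nautical miles")
    # ≈ 52.5 miles / 84.5 km / 45.6 nautical miles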

Haversine formula
  • 52.752 miles
  • 84.895 kilometers
  • 45.840 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth, giving the great-circle distance – the shortest path between the two points along the earth's surface.
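
The haversine calculation is simple enough to carry out directly. The sketch below is self-contained Python using the conventional mean earth radius of 6,371 km, and it reproduces the figures above to within rounding.

    from math import radians, sin, cos, asin, sqrt

    def haversine_km(lat1, lon1, lat2, lon2):
        """Great-circle distance in kilometers between two lat/lon points."""
        phi1, phi2 = radians(lat1), radians(lat2)
        dphi = radians(lat2 - lat1)
        dlam = radians(lon2 - lon1)
        a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
        return 2 * 6371.0 * asin(sqrt(a))   # mean earth radius in km

    km = haversine_km(-10.225000, 142.217778, -10.950556, 142.458889)
    print(f"{km / 1.609344:.3f} mi / {km:.3f} km / {km / 1.852:.3f} NM")
    # ≈ 52.75 mi / 84.88 km / 45.83 NM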

How long does it take to fly from Kubin Island to Bamaga?

The estimated flight time from Kubin Airport to Northern Peninsula Airport is 35 minutes.
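
The site does not publish how it derives this estimate. A common rule of thumb – used below purely as an assumption, not as the calculator's actual method – allows a fixed 30 minutes for taxi, climb, and descent plus cruise time at roughly 500 mph, which lands within a minute of the quoted figure:

    # Assumed rule of thumb, not the site's published formula:
    # fixed 30-minute overhead plus cruise at ~500 mph.
    def flight_minutes(distance_miles, cruise_mph=500.0, overhead_min=30.0):
        return overhead_min + distance_miles / cruise_mph * 60.0

    print(round(flight_minutes(52.508)))   # 36 – close to the quoted 35 minutes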

What is the time difference between Kubin Island and Bamaga?

There is no time difference between Kubin Island and Bamaga.

Flight carbon footprint between Kubin Airport (KUG) and Northern Peninsula Airport (ABM)

On average, flying from Kubin Island to Bamaga generates about 33 kg (roughly 73 pounds) of CO2 per passenger. The figure is an estimate and includes only the CO2 generated by burning jet fuel.
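
As a rough sanity check – a back-of-the-envelope calculation from the page's own figures, not the calculator's methodology – dividing 33 kg by the Vincenty distance gives an implied intensity of about 0.39 kg of CO2 per passenger-kilometre, which is plausible for a very short hop where takeoff dominates fuel burn:

    # Back out the implied emission intensity from the figures above.
    co2_kg = 33.0           # quoted per-passenger estimate
    distance_km = 84.503    # Vincenty distance

    print(f"{co2_kg / distance_km:.2f} kg CO2 per passenger-km")  # ≈ 0.39
    print(f"{co2_kg * 2.20462:.0f} lb CO2 per passenger")         # ≈ 73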

Map of flight path and driving directions from Kubin Island to Bamaga

See the map of the shortest flight path between Kubin Airport (KUG) and Northern Peninsula Airport (ABM).

Airport information

Origin: Kubin Airport
City: Kubin Island
Country: Australia
IATA Code: KUG
ICAO Code: YKUB
Coordinates: 10°13′30″S, 142°13′4″E
Destination: Northern Peninsula Airport
City: Bamaga
Country: Australia
IATA Code: ABM
ICAO Code: YBAM
Coordinates: 10°57′2″S, 142°27′32″E
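
The coordinates above are given in degrees, minutes, and seconds, while the distance formulas earlier on the page take signed decimal degrees. A small conversion sketch (dms_to_decimal is an illustrative helper, not part of any particular library):

    def dms_to_decimal(deg, minutes, seconds, hemisphere):
        """Convert degrees/minutes/seconds to signed decimal degrees."""
        sign = -1.0 if hemisphere in ("S", "W") else 1.0
        return sign * (deg + minutes / 60.0 + seconds / 3600.0)

    print(dms_to_decimal(10, 13, 30, "S"), dms_to_decimal(142, 13, 4, "E"))
    # KUG: -10.225 142.2177...
    print(dms_to_decimal(10, 57, 2, "S"), dms_to_decimal(142, 27, 32, "E"))
    # ABM: -10.9505... 142.4588...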