
How far is Mabuiag Island from Badu Island?

The distance between Badu Island (Badu Island Airport) and Mabuiag Island (Mabuiag Island Airport) is 14 miles / 22 kilometers / 12 nautical miles.

Distance from Badu Island to Mabuiag Island

There are several ways to calculate the distance from Badu Island to Mabuiag Island. Here are two standard methods:

Vincenty's formula (applied above)
  • 13.761 miles
  • 22.147 kilometers
  • 11.958 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
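For readers who want to reproduce the ellipsoidal figure, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. It is an illustration rather than the calculator's own code: the function name vincenty_distance_m, the iteration limit and tolerance, and the decimal-degree coordinates (converted from the airport data below) are assumptions.

```python
import math

# Simplified sketch of Vincenty's inverse formula on the WGS-84 ellipsoid.
# Inputs are decimal degrees; the result is in metres.
def vincenty_distance_m(lat1, lon1, lat2, lon2):
    a = 6378137.0               # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563       # WGS-84 flattening
    b = (1 - f) * a             # semi-minor axis (m)

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):        # iterate until lambda converges (only near-antipodal points fail)
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0.0:
            return 0.0          # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1.0 - sin_alpha ** 2
        if cos2_alpha != 0.0:
            cos_2sigma_m = cos_sigma - 2.0 * sinU1 * sinU2 / cos2_alpha
        else:
            cos_2sigma_m = 0.0  # both points on the equator
        C = f / 16.0 * cos2_alpha * (4.0 + f * (4.0 - 3.0 * cos2_alpha))
        lam_prev = lam
        lam = L + (1.0 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1.0 + 2.0 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384.0 * (4096.0 + u2 * (-768.0 + u2 * (320.0 - 175.0 * u2)))
    B = u2 / 1024.0 * (256.0 + u2 * (-128.0 + u2 * (74.0 - 47.0 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4.0 * (
        cos_sigma * (-1.0 + 2.0 * cos_2sigma_m ** 2)
        - B / 6.0 * cos_2sigma_m * (-3.0 + 4.0 * sin_sigma ** 2)
        * (-3.0 + 4.0 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# Badu Island (BDD) to Mabuiag Island (UBB), decimal degrees from the airport data below
print(vincenty_distance_m(-10.14972, 142.17333, -9.94972, 142.18278))  # ≈ 22,147 m (≈ 13.76 miles)
```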

Haversine formula
  • 13.834 miles
  • 22.264 kilometers
  • 12.021 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth, giving the great-circle distance: the shortest path between two points along the surface.
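Likewise, a short Python sketch of the haversine formula. The function name haversine_km and the mean Earth radius of 6,371 km are assumptions; treating the earth as a sphere rather than an ellipsoid is why the result differs slightly from the Vincenty figure above.

```python
import math

# Great-circle distance on a sphere; inputs in decimal degrees, result in kilometres.
def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

# Badu Island (BDD) to Mabuiag Island (UBB)
print(haversine_km(-10.14972, 142.17333, -9.94972, 142.18278))  # ≈ 22.26 km (≈ 13.83 miles)
```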

How long does it take to fly from Badu Island to Mabuiag Island?

The estimated flight time from Badu Island Airport to Mabuiag Island Airport is 31 minutes.

What is the time difference between Badu Island and Mabuiag Island?

There is no time difference between Badu Island and Mabuiag Island.

Flight carbon footprint between Badu Island Airport (BDD) and Mabuiag Island Airport (UBB)

On average, flying from Badu Island to Mabuiag Island generates about 27 kg of CO2 per passenger (roughly 60 pounds). These figures are estimates and include only the CO2 generated by burning jet fuel.

Map of flight path from Badu Island to Mabuiag Island

See the map of the shortest flight path between Badu Island Airport (BDD) and Mabuiag Island Airport (UBB).

Airport information

Origin: Badu Island Airport
City: Badu Island
Country: Australia
IATA Code: BDD
ICAO Code: YBAU
Coordinates: 10°8′59″S, 142°10′24″E
Destination: Mabuiag Island Airport
City: Mabuiag Island
Country: Australia
IATA Code: UBB
ICAO Code: YMAA
Coordinates: 9°56′59″S, 142°10′58″E
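The sketches above take decimal degrees, while the coordinates here are listed in degrees, minutes and seconds. A small helper shows the conversion; dms_to_decimal is a hypothetical name, not part of the site.

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter (N/S/E/W) to signed decimal degrees."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

# Badu Island Airport (BDD): 10°8′59″S, 142°10′24″E
print(dms_to_decimal(10, 8, 59, "S"), dms_to_decimal(142, 10, 24, "E"))   # -10.1497..., 142.1733...
# Mabuiag Island Airport (UBB): 9°56′59″S, 142°10′58″E
print(dms_to_decimal(9, 56, 59, "S"), dms_to_decimal(142, 10, 58, "E"))   # -9.9497..., 142.1828...
```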