
How far is Mabuiag Island from Bourke?

The distance between Bourke (Bourke Airport) and Mabuiag Island (Mabuiag Island Airport) is 1403 miles / 2258 kilometers / 1219 nautical miles.

The driving distance from Bourke (BRK) to Mabuiag Island (UBB) is 1763 miles / 2837 kilometers, and travel time by car is about 44 hours 13 minutes.

Bourke Airport – Mabuiag Island Airport

1403 miles / 2258 kilometers / 1219 nautical miles


Distance from Bourke to Mabuiag Island

There are several ways to calculate the distance from Bourke to Mabuiag Island. Here are two standard methods:

Vincenty's formula (applied above)
  • 1403.269 miles
  • 2258.342 kilometers
  • 1219.407 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
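For reference, here is a minimal Python sketch of the standard Vincenty inverse iteration on the WGS-84 ellipsoid. The function name, convergence tolerance, and choice of WGS-84 constants are assumptions for illustration; the calculator's internal implementation and rounding may differ slightly.

```python
import math

def vincenty_distance_m(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns metres."""
    a = 6378137.0              # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a            # semi-minor axis

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))   # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1, sinU2, cosU2 = math.sin(U1), math.cos(U1), math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam, cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0                                        # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
        - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# BRK and UBB coordinates from the airport table, in decimal degrees (south = negative)
metres = vincenty_distance_m(-30.039167, 145.951944, -9.949722, 142.182778)
print(f"{metres / 1609.344:.1f} mi  {metres / 1000:.1f} km  {metres / 1852:.1f} NM")
```

With these WGS-84 constants the result should land very close to the 1403 mi / 2258 km figure quoted above.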

Haversine formula
  • 1409.132 miles
  • 2267.779 kilometers
  • 1224.503 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
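As a concrete illustration, the short Python sketch below applies the haversine formula to the two airports' coordinates from the table further down. The 6371 km mean Earth radius is an assumption (a common choice), so the output only approximately reproduces the figures above.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given mean radius (km)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

# Airport coordinates in decimal degrees (south latitudes negative)
brk = (-30.039167, 145.951944)   # Bourke Airport (BRK)
ubb = (-9.949722, 142.182778)    # Mabuiag Island Airport (UBB)

km = haversine_km(*brk, *ubb)
print(f"{km * 0.621371:.1f} mi  {km:.1f} km  {km / 1.852:.1f} NM")
# roughly 1409 mi / 2268 km / 1224 NM with R = 6371 km
```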

How long does it take to fly from Bourke to Mabuiag Island?

The estimated flight time from Bourke Airport to Mabuiag Island Airport is 3 hours and 9 minutes.
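The calculator does not publish how it derives this estimate. A rough back-of-the-envelope version, assuming an average block speed of about 500 mph plus a fixed 30-minute allowance for taxi, climb, and descent (both parameters are assumptions, not the site's method), looks like the sketch below and lands in the same range, though not exactly on 3 hours 9 minutes.

```python
def estimated_flight_time(distance_miles, avg_speed_mph=500.0, overhead_minutes=30):
    """Rough block-time estimate: cruise leg plus a fixed taxi/climb/descent allowance.
    Both parameters are illustrative assumptions."""
    total_minutes = distance_miles / avg_speed_mph * 60 + overhead_minutes
    hours, minutes = divmod(round(total_minutes), 60)
    return f"{hours} h {minutes} min"

print(estimated_flight_time(1403))   # about 3 h 18 min under these assumptions
```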

Flight carbon footprint between Bourke Airport (BRK) and Mabuiag Island Airport (UBB)

On average, flying from Bourke to Mabuiag Island generates about 173 kg of CO2 per passenger, which is roughly 381 pounds (lb). These figures are estimates and include only the CO2 generated by burning jet fuel.

Map of flight path and driving directions from Bourke to Mabuiag Island

See the map of the shortest flight path between Bourke Airport (BRK) and Mabuiag Island Airport (UBB).

Airport information

Origin: Bourke Airport
City: Bourke
Country: Australia
IATA Code: BRK
ICAO Code: YBKE
Coordinates: 30°2′21″S, 145°57′7″E

Destination: Mabuiag Island Airport
City: Mabuiag Island
Country: Australia
IATA Code: UBB
ICAO Code: YMAA
Coordinates: 9°56′59″S, 142°10′58″E