How far is Whyalla from Mabuiag Island?

The distance between Mabuiag Island (Mabuiag Island Airport) and Whyalla (Whyalla Airport) is 1618 miles / 2603 kilometers / 1406 nautical miles.

The driving distance from Mabuiag Island (UBB) to Whyalla (WYA) is 2457 miles / 3954 kilometers, and travel time by car is about 57 hours 29 minutes.

Mabuiag Island Airport – Whyalla Airport

Distance: 1618 miles / 2603 kilometers / 1406 nautical miles
Flight time: 3 h 33 min
CO2 emission: 187 kg

Distance from Mabuiag Island to Whyalla

There are several ways to calculate the distance from Mabuiag Island to Whyalla. Here are two standard methods:

Vincenty's formula (applied above)
  • 1617.669 miles
  • 2603.386 kilometers
  • 1405.716 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
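
For readers who want to reproduce the figure, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid (the function name and structure are ours; the coordinates are the decimal-degree equivalents of the DMS values listed under Airport information below):

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2):
    """Inverse Vincenty distance on the WGS-84 ellipsoid, in kilometers."""
    a = 6378137.0             # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563     # WGS-84 flattening
    b = (1 - f) * a

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate until the longitude difference converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)  # equatorial geodesic
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
        - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
        * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - d_sigma) / 1000

# UBB (9°56′59″S, 142°10′58″E) to WYA (33°3′32″S, 137°30′50″E)
print(vincenty_km(-9.94972, 142.18278, -33.05889, 137.51389))  # ≈ 2603.4
```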

Haversine formula
  • 1624.081 miles
  • 2613.706 kilometers
  • 1411.288 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
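
A corresponding haversine sketch, using the same coordinates and a mean Earth radius of 6371 km:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, r=6371.0):
    """Great-circle distance on a sphere of radius r (km)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    d_phi = math.radians(lat2 - lat1)
    d_lam = math.radians(lon2 - lon1)
    a = (math.sin(d_phi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(d_lam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

print(haversine_km(-9.94972, 142.18278, -33.05889, 137.51389))  # ≈ 2613.7
```

The roughly 10 km gap between the two results reflects the spherical versus ellipsoidal Earth models.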

How long does it take to fly from Mabuiag Island to Whyalla?

The estimated flight time from Mabuiag Island Airport to Whyalla Airport is 3 hours and 33 minutes.
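
The calculator does not publish its exact inputs, but flight-time estimates of this kind typically combine an average cruise speed with a fixed allowance for taxi, climb, and descent. Here is a sketch with assumed parameters (roughly 500 mph cruise and 30 minutes of overhead), which lands near, though not exactly on, the 3 h 33 min figure above:

```python
def flight_time_hours(distance_miles, cruise_mph=500, overhead_h=0.5):
    # cruise_mph and overhead_h are assumptions, not the site's published values
    return overhead_h + distance_miles / cruise_mph

t = flight_time_hours(1618)
print(f"{int(t)} h {round(t % 1 * 60)} min")  # 3 h 44 min with these assumptions
```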

Flight carbon footprint between Mabuiag Island Airport (UBB) and Whyalla Airport (WYA)

On average, flying from Mabuiag Island to Whyalla generates about 187 kg of CO2 per passenger (equivalent to 412 pounds). These figures are estimates and include only the CO2 generated by burning jet fuel.
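
Per-passenger estimates like this are usually the distance multiplied by an average emission factor. Back-solving from the figures above gives roughly 72 g of CO2 per passenger-kilometer; the sketch below uses that assumed factor (real factors vary with aircraft type, load factor, and route length):

```python
def co2_kg(distance_km, kg_per_pax_km=0.072):
    # 0.072 kg/pax-km is back-solved from 187 kg over 2603 km; an assumption
    return distance_km * kg_per_pax_km

kg = round(co2_kg(2603))
print(kg, "kg =", round(kg * 2.20462), "lbs")  # 187 kg = 412 lbs
```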

Map of flight path and driving directions from Mabuiag Island to Whyalla

See the map of the shortest flight path between Mabuiag Island Airport (UBB) and Whyalla Airport (WYA).

Airport information

Origin: Mabuiag Island Airport
City: Mabuiag Island
Country: Australia
IATA Code: UBB
ICAO Code: YMAA
Coordinates: 9°56′59″S, 142°10′58″E
Destination: Whyalla Airport
City: Whyalla
Country: Australia
IATA Code: WYA
ICAO Code: YWHA
Coordinates: 33°3′32″S, 137°30′50″E
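
The coordinates above are given in degrees, minutes, and seconds, while the distance formulas earlier on this page take decimal degrees. A small conversion helper (the function name is ours):

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds to decimal degrees (S and W negative)."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

ubb = (dms_to_decimal(9, 56, 59, "S"), dms_to_decimal(142, 10, 58, "E"))
wya = (dms_to_decimal(33, 3, 32, "S"), dms_to_decimal(137, 30, 50, "E"))
print(ubb)  # ≈ (-9.9497, 142.1828)
print(wya)  # ≈ (-33.0589, 137.5139)
```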