
How far is Mabuiag Island from Losuia?

The distance between Losuia (Losuia Airport) and Mabuiag Island (Mabuiag Island Airport) is 616 miles / 991 kilometers / 535 nautical miles.

Distance from Losuia to Mabuiag Island

There are several ways to calculate the distance from Losuia to Mabuiag Island. Here are two standard methods:

Vincenty's formula (applied above)
  • 615.589 miles
  • 990.694 kilometers
  • 534.932 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
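
For readers who want to reproduce the figure, here is a textbook Python sketch of Vincenty's inverse method. It assumes the WGS-84 ellipsoid; the calculator does not state which reference ellipsoid it uses, so the last decimal places may differ.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Distance in statute miles via Vincenty's inverse method on the
    WGS-84 ellipsoid (a standard textbook implementation)."""
    a = 6378137.0            # WGS-84 semi-major axis (meters)
    f = 1 / 298.257223563    # WGS-84 flattening
    b = (1 - f) * a          # semi-minor axis

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.hypot(cosU2 * sinLam,
                              cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        if sinSigma == 0:
            return 0.0       # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = cosSigma - 2 * sinU1 * sinU2 / cos2Alpha if cos2Alpha else 0.0
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (
                cos2SigmaM + C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < tol:
            break

    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    dSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2)
        - B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2) * (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - dSigma) / 1609.344  # meters -> statute miles

# LSA (8°30′20″S, 151°4′51″E) to UBB (9°56′59″S, 142°10′58″E)
print(f"{vincenty_miles(-8.50556, 151.08083, -9.94972, 142.18278):.3f} miles")
```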

Haversine formula
  • 614.953 miles
  • 989.671 kilometers
  • 534.379 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest path between two points along the surface of the sphere).
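
The haversine computation is compact enough to show in full. The sketch below uses a mean earth radius of 3,958.8 miles (an assumption; any similar spherical radius reproduces the figures above to within a fraction of a mile):

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2, radius_miles=3958.8):
    """Great-circle distance in statute miles, assuming a spherical earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_miles * math.asin(math.sqrt(a))

# Decimal-degree coordinates from the airport table below
losuia = (-8.50556, 151.08083)    # LSA: 8°30′20″S, 151°4′51″E
mabuiag = (-9.94972, 142.18278)   # UBB: 9°56′59″S, 142°10′58″E
print(f"{haversine_miles(*losuia, *mabuiag):.3f} miles")  # ≈ 614.95 miles
```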

How long does it take to fly from Losuia to Mabuiag Island?

The estimated flight time from Losuia Airport to Mabuiag Island Airport is 1 hour and 39 minutes.
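
The calculator does not publish its flight-time model. A common rule of thumb is a fixed allowance for taxi, climb, and descent plus cruise at an average ground speed; the sketch below uses assumed parameters (500 mph cruise, 30-minute overhead) and lands in the same ballpark as the quoted 1 hour 39 minutes, not exactly on it.

```python
def block_time(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough block-time estimate: an assumed fixed overhead for taxi,
    climb, and descent, plus cruise at an assumed average ground speed."""
    total_min = overhead_min + distance_miles / cruise_mph * 60
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} h {minutes} min"

print(block_time(616))  # ~1 h 44 min with these assumed parameters
```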

What is the time difference between Losuia and Mabuiag Island?

There is no time difference between Losuia and Mabuiag Island; both observe UTC+10 year-round.

Flight carbon footprint between Losuia Airport (LSA) and Mabuiag Island Airport (UBB)

On average, flying from Losuia to Mabuiag Island generates about 115 kg of CO2 per passenger (roughly 253 pounds). The figures are estimates and include only the CO2 generated by burning jet fuel.
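
The kilogram-to-pound conversion is simple arithmetic; the per-passenger figure itself depends on the calculator's unpublished emissions model. The sketch below shows the conversion and the emissions factor implied by the quoted numbers:

```python
LBS_PER_KG = 2.20462

co2_kg = 115   # per-passenger estimate quoted above
miles = 616
print(f"{co2_kg * LBS_PER_KG:.1f} lbs")          # ≈ 253.5 lbs
print(f"{co2_kg / miles:.3f} kg CO2 per mile")   # ≈ 0.187, implied flat factor
```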

Map of flight path from Losuia to Mabuiag Island

See the map of the shortest flight path between Losuia Airport (LSA) and Mabuiag Island Airport (UBB).

Airport information

Origin: Losuia Airport
City: Losuia
Country: Papua New Guinea
IATA Code: LSA
ICAO Code: AYKA
Coordinates: 8°30′20″S, 151°4′51″E
Destination: Mabuiag Island Airport
City: Mabuiag Island
Country: Australia
IATA Code: UBB
ICAO Code: YMAA
Coordinates: 9°56′59″S, 142°10′58″E
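
To plug these coordinates into the formulas above, the degrees/minutes/seconds values must first be converted to signed decimal degrees. A minimal helper:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter
    (N/S/E/W) to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# LSA: 8°30′20″S, 151°4′51″E  ->  (-8.50556, 151.08083)
print(dms_to_decimal(8, 30, 20, "S"), dms_to_decimal(151, 4, 51, "E"))
```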