How far is Lismore from Mabuiag Island?

The distance between Mabuiag Island (Mabuiag Island Airport) and Lismore (Lismore Airport) is 1484 miles / 2389 kilometers / 1290 nautical miles.

The driving distance from Mabuiag Island (UBB) to Lismore (LSY) is 1807 miles / 2908 kilometers, and travel time by car is about 43 hours 46 minutes.

Mabuiag Island Airport – Lismore Airport

1484 miles / 2389 kilometers / 1290 nautical miles

Distance from Mabuiag Island to Lismore

There are several ways to calculate the distance from Mabuiag Island to Lismore. Here are two standard methods:

Vincenty's formula (applied above)
  • 1484.189 miles
  • 2388.571 kilometers
  • 1289.725 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
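As a rough reproduction of the ellipsoidal figure, here is a minimal sketch in Python using the geopy library; its geodesic() routine applies Karney's algorithm on the WGS-84 ellipsoid rather than Vincenty's formula, but the two agree very closely on a route like this. The decimal coordinates are converted from the airport coordinates listed under "Airport information" below.

  from geopy.distance import geodesic

  # Airport coordinates (see "Airport information" below), in decimal degrees.
  ubb = (-9.949722, 142.182778)   # Mabuiag Island Airport: 9°56′59″S, 142°10′58″E
  lsy = (-28.830278, 153.259722)  # Lismore Airport: 28°49′49″S, 153°15′35″E

  d = geodesic(ubb, lsy)  # distance on the WGS-84 ellipsoid
  print(f"{d.miles:.3f} mi / {d.kilometers:.3f} km / {d.nautical:.3f} NM")
  # Should land very close to the Vincenty figures above (about 1484 mi / 2389 km).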

Haversine formula
  • 1488.706 miles
  • 2395.841 kilometers
  • 1293.650 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
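The formula is short enough to compute directly; below is a minimal sketch assuming a mean earth radius of 6371.0 km, the usual spherical approximation.

  import math

  def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
      # Great-circle distance between two latitude/longitude points, in kilometers.
      phi1, phi2 = math.radians(lat1), math.radians(lat2)
      dphi = math.radians(lat2 - lat1)
      dlam = math.radians(lon2 - lon1)
      a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
      return 2 * radius_km * math.asin(math.sqrt(a))

  km = haversine_km(-9.949722, 142.182778, -28.830278, 153.259722)
  print(f"{km:.1f} km / {km / 1.609344:.1f} mi / {km / 1.852:.1f} NM")
  # Roughly 2396 km / 1489 mi, matching the haversine figures above.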

How long does it take to fly from Mabuiag Island to Lismore?

The estimated flight time from Mabuiag Island Airport to Lismore Airport is 3 hours and 18 minutes.
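The calculator's exact assumptions are not stated; as a rough sanity check, an average block speed of about 450 mph (an assumed figure) gives 1484 miles ÷ 450 mph ≈ 3.3 hours, which is consistent with the 3 hours 18 minutes quoted above.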

What is the time difference between Mabuiag Island and Lismore?

Mabuiag Island and Lismore are both on Australian Eastern Standard Time (UTC+10), so there is normally no time difference. Note, however, that Lismore (New South Wales) observes daylight saving time while Queensland does not, so the two differ by one hour during the daylight-saving period.

Flight carbon footprint between Mabuiag Island Airport (UBB) and Lismore Airport (LSY)

On average, flying from Mabuiag Island to Lismore generates about 178 kg of CO2 per passenger, which is equivalent to roughly 393 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
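For context, that works out to an implied rate of roughly 178 kg ÷ 2389 km ≈ 0.075 kg of CO2 per passenger-kilometre on this route; the underlying emission model is not stated here.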

Map of flight path and driving directions from Mabuiag Island to Lismore

See the map of the shortest flight path between Mabuiag Island Airport (UBB) and Lismore Airport (LSY).

Airport information

Origin Mabuiag Island Airport
City: Mabuiag Island
Country: Australia
IATA Code: UBB
ICAO Code: YMAA
Coordinates: 9°56′59″S, 142°10′58″E
Destination Lismore Airport
City: Lismore
Country: Australia
IATA Code: LSY
ICAO Code: YLIS
Coordinates: 28°49′49″S, 153°15′35″E
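The coordinates above are given in degrees, minutes, and seconds; the small sketch below (the helper name is illustrative) converts them to the decimal degrees used in the distance examples earlier on this page.

  def dms_to_decimal(degrees, minutes, seconds, hemisphere):
      # Convert degrees/minutes/seconds plus a hemisphere letter to signed decimal degrees.
      value = degrees + minutes / 60 + seconds / 3600
      return -value if hemisphere in ("S", "W") else value

  # Mabuiag Island Airport (UBB): 9°56′59″S, 142°10′58″E
  print(dms_to_decimal(9, 56, 59, "S"), dms_to_decimal(142, 10, 58, "E"))
  # Lismore Airport (LSY): 28°49′49″S, 153°15′35″E
  print(dms_to_decimal(28, 49, 49, "S"), dms_to_decimal(153, 15, 35, "E"))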