
How far is Wagga Wagga from Saibai Island?

The distance between Saibai Island (Saibai Island Airport) and Wagga Wagga (Wagga Wagga Airport) is 1801 miles / 2898 kilometers / 1565 nautical miles.

The driving distance from Saibai Island (SBR) to Wagga Wagga (WGA) is 2187 miles / 3520 kilometers, and travel time by car is about 52 hours 14 minutes.

Saibai Island Airport – Wagga Wagga Airport

  • 1801 miles
  • 2898 kilometers
  • 1565 nautical miles

Distance from Saibai Island to Wagga Wagga

There are several ways to calculate the distance from Saibai Island to Wagga Wagga. Here are two standard methods:

Vincenty's formula (applied above)
  • 1800.729 miles
  • 2897.993 kilometers
  • 1564.791 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
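As a sketch of how an ellipsoidal calculation like this can be reproduced, the snippet below uses pyproj's Geod.inv (Karney's geodesic algorithm) as a stand-in for Vincenty's formula; both work on the WGS-84 ellipsoid and agree to well under a metre at this range. The decimal coordinates are converted from the airport information at the bottom of this page.

    # Ellipsoidal (WGS-84) distance between SBR and WGA.
    # Note: pyproj computes the geodesic with Karney's algorithm, used here as
    # a stand-in for Vincenty's formula; results match at the sub-metre level.
    from pyproj import Geod

    sbr_lat, sbr_lon = -9.378056, 142.625000     # Saibai Island Airport (SBR)
    wga_lat, wga_lon = -35.165278, 147.465833    # Wagga Wagga Airport (WGA)

    geod = Geod(ellps="WGS84")
    _, _, meters = geod.inv(sbr_lon, sbr_lat, wga_lon, wga_lat)  # lon, lat order

    print(f"{meters / 1609.344:.3f} miles")        # ~1800.7
    print(f"{meters / 1000:.3f} kilometers")       # ~2898.0
    print(f"{meters / 1852:.3f} nautical miles")   # ~1564.8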

Haversine formula
  • 1807.716 miles
  • 2909.237 kilometers
  • 1570.863 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
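The haversine result above can be checked with a minimal pure-Python sketch, assuming a mean earth radius of 6371 km:

    # Great-circle (haversine) distance between SBR and WGA on a sphere of
    # mean radius 6371 km.
    from math import radians, sin, cos, atan2, sqrt

    def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
        phi1, phi2 = radians(lat1), radians(lat2)
        dphi = radians(lat2 - lat1)
        dlam = radians(lon2 - lon1)
        a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
        return radius_km * 2 * atan2(sqrt(a), sqrt(1 - a))

    km = haversine_km(-9.378056, 142.625000, -35.165278, 147.465833)
    print(f"{km / 1.609344:.3f} miles")        # ~1807.7
    print(f"{km:.3f} kilometers")              # ~2909.2
    print(f"{km / 1.852:.3f} nautical miles")  # ~1570.9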

How long does it take to fly from Saibai Island to Wagga Wagga?

The estimated flight time from Saibai Island Airport to Wagga Wagga Airport is 3 hours and 54 minutes.
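The exact formula behind this estimate is not stated, but it is consistent with a simple rule of thumb: great-circle distance divided by an assumed average block speed of roughly 460 mph (an assumption, covering climb, cruise and descent).

    # Back-of-envelope flight-time estimate (assumed method, not necessarily
    # the one used above): distance / average block speed.
    distance_miles = 1801
    avg_speed_mph = 460                      # assumption
    hours = distance_miles / avg_speed_mph
    print(f"{int(hours)} h {round(hours % 1 * 60)} min")   # ~3 h 55 min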

What is the time difference between Saibai Island and Wagga Wagga?

Saibai Island (Queensland) and Wagga Wagga (New South Wales) both use Australian Eastern Standard Time (UTC+10), so there is normally no time difference. During daylight saving, however, New South Wales shifts to UTC+11 while Queensland does not, putting Wagga Wagga one hour ahead.

Flight carbon footprint between Saibai Island Airport (SBR) and Wagga Wagga Airport (WGA)

On average, flying from Saibai Island to Wagga Wagga generates about 200 kg (441 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
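The unit conversion and the implied per-mile figure can be checked directly (the 200 kg value itself is the site's estimate; only the arithmetic below is derived from it):

    # Unit check on the per-passenger CO2 estimate.
    co2_kg = 200
    print(f"{co2_kg * 2.20462:.0f} lbs")                   # ~441 lbs
    print(f"{co2_kg / 1801:.3f} kg CO2 per mile flown")    # ~0.111 kg/mi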

Map of flight path and driving directions from Saibai Island to Wagga Wagga

See the map of the shortest flight path between Saibai Island Airport (SBR) and Wagga Wagga Airport (WGA).

Airport information

Origin Saibai Island Airport
City: Saibai Island
Country: Australia
IATA Code: SBR
ICAO Code: YSII
Coordinates: 9°22′41″S, 142°37′30″E
Destination Wagga Wagga Airport
City: Wagga Wagga
Country: Australia
IATA Code: WGA
ICAO Code: YSWG
Coordinates: 35°9′55″S, 147°27′57″E
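The decimal coordinates used in the distance sketches earlier on this page come from converting the DMS values listed above; a small illustrative helper:

    # Convert degrees-minutes-seconds to the decimal degrees used earlier.
    def dms_to_decimal(degrees, minutes, seconds, hemisphere):
        sign = -1 if hemisphere in ("S", "W") else 1
        return sign * (degrees + minutes / 60 + seconds / 3600)

    print(dms_to_decimal(9, 22, 41, "S"), dms_to_decimal(142, 37, 30, "E"))    # SBR
    print(dms_to_decimal(35, 9, 55, "S"), dms_to_decimal(147, 27, 57, "E"))    # WGA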