
How far is Whyalla from Saibai Island?

The distance between Saibai Island (Saibai Island Airport) and Whyalla (Whyalla Airport) is 1662 miles / 2674 kilometers / 1444 nautical miles.

The driving distance from Saibai Island (SBR) to Whyalla (WYA) is 2457 miles / 3954 kilometers, and travel time by car is about 57 hours 29 minutes.

Saibai Island Airport – Whyalla Airport

  • Distance: 1662 miles / 2674 kilometers / 1444 nautical miles
  • Flight time: 3 h 38 min
  • CO2 emission: 190 kg


Distance from Saibai Island to Whyalla

There are several ways to calculate the distance from Saibai Island to Whyalla. Here are two standard methods:

Vincenty's formula (applied above)
  • 1661.693 miles
  • 2674.236 kilometers
  • 1443.972 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
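As a rough cross-check of the ellipsoidal figures above, the distance can be computed with the geopy library (an assumption here; the page does not say which implementation it uses). geopy's geodesic distance (Karney's method on the WGS-84 ellipsoid by default) should agree with the Vincenty values to within a very small margin.

```python
# A minimal sketch, assuming the geopy library is installed;
# coordinates are taken from the airport information section below.
from geopy.distance import geodesic

saibai_island = (-9.378056, 142.625000)   # 9°22′41″S, 142°37′30″E
whyalla = (-33.058889, 137.513889)        # 33°3′32″S, 137°30′50″E

d = geodesic(saibai_island, whyalla)      # WGS-84 ellipsoid by default
print(f"{d.miles:.3f} miles")
print(f"{d.km:.3f} kilometers")
print(f"{d.nautical:.3f} nautical miles")
```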

Haversine formula
  • 1668.281 miles
  • 2684.838 kilometers
  • 1449.697 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
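The haversine formula is simple enough to write out directly. The sketch below is a self-contained Python version, assuming a mean Earth radius of 6371 km (the page may use a slightly different radius, so the last digits can differ).

```python
# Great-circle (haversine) distance on a spherical earth;
# the 6371 km mean radius is an assumption, not taken from the page.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Distance between two latitude/longitude points, in kilometers."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
    return 2 * radius_km * asin(sqrt(a))

km = haversine_km(-9.378056, 142.625, -33.058889, 137.513889)
print(f"{km:.3f} km  /  {km / 1.609344:.3f} mi  /  {km / 1.852:.3f} NM")
```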

How long does it take to fly from Saibai Island to Whyalla?

The estimated flight time from Saibai Island Airport to Whyalla Airport is 3 hours and 38 minutes.
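The page does not state how this estimate is derived. A common rule of thumb, sketched below with illustrative values, is to divide the great-circle distance by a typical cruise speed of around 850 km/h and add roughly 30 minutes for taxi, takeoff, and landing, which lands within a minute or so of the quoted figure.

```python
# A hedged sketch of the flight-time estimate; cruise speed and ground-time
# allowance are illustrative assumptions, not the site's stated method.
CRUISE_SPEED_KMH = 850
GROUND_TIME_MIN = 30

distance_km = 2674.236
total_min = distance_km / CRUISE_SPEED_KMH * 60 + GROUND_TIME_MIN
print(f"{int(total_min // 60)} h {round(total_min % 60)} min")  # roughly 3 h 39 min
```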

Flight carbon footprint between Saibai Island Airport (SBR) and Whyalla Airport (WYA)

On average, flying from Saibai Island to Whyalla generates about 190 kg of CO2 per passenger; 190 kilograms is equal to 419 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
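For reference, the kilograms-to-pounds conversion and the implied per-kilometer intensity can be checked with a couple of lines; the 190 kg figure itself is the page's estimate, not derived here.

```python
# Quick unit check of the page's CO2 estimate (190 kg per passenger).
co2_kg = 190
print(f"{co2_kg * 2.20462:.0f} lbs")          # about 419 lbs
print(f"{co2_kg / 2674:.3f} kg CO2 per km")   # implied per-passenger intensity
```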

Map of flight path and driving directions from Saibai Island to Whyalla

See the map of the shortest flight path between Saibai Island Airport (SBR) and Whyalla Airport (WYA).

Airport information

Origin: Saibai Island Airport
City: Saibai Island
Country: Australia
IATA Code: SBR
ICAO Code: YSII
Coordinates: 9°22′41″S, 142°37′30″E
Destination: Whyalla Airport
City: Whyalla
Country: Australia
IATA Code: WYA
ICAO Code: YWHA
Coordinates: 33°3′32″S, 137°30′50″E