
How far is Saibai Island from Kalgoorlie?

The distance between Kalgoorlie (Kalgoorlie-Boulder Airport) and Saibai Island (Saibai Island Airport) is 2007 miles / 3230 kilometers / 1744 nautical miles.

The driving distance from Kalgoorlie (KGI) to Saibai Island (SBR) is 3343 miles / 5380 kilometers, and travel time by car is about 77 hours 21 minutes.

Kalgoorlie-Boulder Airport – Saibai Island Airport

Distance: 2007 miles / 3230 kilometers / 1744 nautical miles


Distance from Kalgoorlie to Saibai Island

There are several ways to calculate the distance from Kalgoorlie to Saibai Island. Here are two standard methods:

Vincenty's formula (applied above)
  • 2006.933 miles
  • 3229.845 kilometers
  • 1743.977 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
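As a rough illustration, the ellipsoidal figure above can be reproduced with the geopy library, whose geodesic distance is computed on the WGS-84 ellipsoid (Karney's algorithm, which agrees with Vincenty's formula to well under a metre). The coordinates below are the airport coordinates listed at the bottom of this page, converted to decimal degrees; this is a sketch, not the calculator's own code.

    from geopy.distance import geodesic

    # Airport coordinates in decimal degrees (converted from the DMS values below)
    kalgoorlie = (-30.78917, 121.46194)   # Kalgoorlie-Boulder Airport (KGI)
    saibai = (-9.37806, 142.62500)        # Saibai Island Airport (SBR)

    d = geodesic(kalgoorlie, saibai)      # WGS-84 ellipsoidal distance
    print(f"{d.miles:.3f} mi / {d.km:.3f} km / {d.nautical:.3f} NM")
    # Should land within a few hundred metres of 2006.933 mi / 3229.845 km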

Haversine formula
  • 2010.209 miles
  • 3235.117 kilometers
  • 1746.823 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
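For comparison, the haversine (great-circle) figure can be reproduced in a few lines of Python. The mean Earth radius of 6371 km is an assumption; small differences in the radius or in coordinate rounding account for the last kilometre or two.

    from math import radians, sin, cos, atan2, sqrt

    def haversine_km(lat1, lon1, lat2, lon2, r=6371.0):
        """Great-circle distance in kilometres, assuming a spherical Earth."""
        phi1, phi2 = radians(lat1), radians(lat2)
        dphi = radians(lat2 - lat1)
        dlmb = radians(lon2 - lon1)
        a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
        return 2 * r * atan2(sqrt(a), sqrt(1 - a))

    km = haversine_km(-30.78917, 121.46194, -9.37806, 142.62500)
    print(f"{km:.1f} km / {km * 0.621371:.1f} mi / {km / 1.852:.1f} NM")
    # ≈ 3235 km / 2010 mi / 1747 NM, matching the figures above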

How long does it take to fly from Kalgoorlie to Saibai Island?

The estimated flight time from Kalgoorlie-Boulder Airport to Saibai Island Airport is 4 hours and 17 minutes.
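The flight time is presumably derived from the distance and a typical cruise speed, though the exact model is not published here. A common back-of-the-envelope estimate looks like the sketch below; the cruise speed and the taxi/climb allowance are assumptions, not the site's actual parameters.

    # Rough flight-time estimate: distance / assumed cruise speed, plus a fixed
    # allowance for taxi, climb and descent. Both constants are assumptions.
    DISTANCE_MILES = 2007
    CRUISE_MPH = 500          # typical cruise ground speed (assumed)
    OVERHEAD_MIN = 30         # taxi/climb/descent allowance (assumed)

    total_min = DISTANCE_MILES / CRUISE_MPH * 60 + OVERHEAD_MIN
    print(f"about {int(total_min // 60)} h {int(total_min % 60)} min")
    # Prints roughly 4 h 30 min with these constants; the page quotes
    # 4 h 17 min, so the calculator evidently uses different parameters.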

Flight carbon footprint between Kalgoorlie-Boulder Airport (KGI) and Saibai Island Airport (SBR)

On average, flying from Kalgoorlie to Saibai Island generates about 219 kg of CO2 per passenger, which is equivalent to roughly 483 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
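Dividing the quoted figure by the distance implies an average of roughly 0.11 kg of CO2 per passenger-mile. The sketch below simply re-derives the numbers on this page; the per-mile rate is implied by them, not a published emissions factor.

    DISTANCE_MILES = 2007
    CO2_KG = 219
    KG_PER_LB = 0.45359237

    print(f"{CO2_KG / DISTANCE_MILES:.3f} kg CO2 per passenger-mile")  # ≈ 0.109
    print(f"{CO2_KG / KG_PER_LB:.0f} lb")                              # ≈ 483 lb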

Map of flight path and driving directions from Kalgoorlie to Saibai Island

See the map of the shortest flight path between Kalgoorlie-Boulder Airport (KGI) and Saibai Island Airport (SBR).

Airport information

Origin: Kalgoorlie-Boulder Airport
City: Kalgoorlie
Country: Australia
IATA Code: KGI
ICAO Code: YPKG
Coordinates: 30°47′21″S, 121°27′43″E
Destination: Saibai Island Airport
City: Saibai Island
Country: Australia
IATA Code: SBR
ICAO Code: YSII
Coordinates: 9°22′41″S, 142°37′30″E
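The coordinates above are given in degrees, minutes and seconds. Converting them to the decimal degrees used in the distance sketches earlier is straightforward; here is a small helper (the function name is just for illustration):

    def dms_to_decimal(degrees, minutes, seconds, hemisphere):
        """Convert degrees/minutes/seconds to signed decimal degrees."""
        value = degrees + minutes / 60 + seconds / 3600
        return -value if hemisphere in ("S", "W") else value

    # Kalgoorlie-Boulder Airport (KGI): 30°47′21″S, 121°27′43″E
    print(dms_to_decimal(30, 47, 21, "S"), dms_to_decimal(121, 27, 43, "E"))
    # Saibai Island Airport (SBR): 9°22′41″S, 142°37′30″E
    print(dms_to_decimal(9, 22, 41, "S"), dms_to_decimal(142, 37, 30, "E"))
    # -> approximately -30.7892, 121.4619 and -9.3781, 142.6250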