
How far is Whyalla from Kalgoorlie?

The distance between Kalgoorlie (Kalgoorlie-Boulder Airport) and Whyalla (Whyalla Airport) is 955 miles / 1537 kilometers / 830 nautical miles.

The driving distance from Kalgoorlie (KGI) to Whyalla (WYA) is 1144 miles / 1841 kilometers, and travel time by car is about 21 hours 21 minutes.

Kalgoorlie-Boulder Airport – Whyalla Airport

  • Distance: 955 miles / 1537 kilometers / 830 nautical miles
  • Flight time: 2 h 18 min
  • Time difference: 1 h 30 min
  • CO2 emission: 148 kg
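The 1 h 30 min time difference can be checked with Python's standard `zoneinfo` module. This sketch assumes Kalgoorlie falls in the Australia/Perth zone (AWST, UTC+8) and Whyalla in Australia/Adelaide (ACST, UTC+9:30 outside daylight saving); a winter date is used so Adelaide's daylight-saving shift does not apply.

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# Assumed zones: Kalgoorlie -> Australia/Perth (UTC+8),
# Whyalla -> Australia/Adelaide (UTC+9:30 in standard time).
when = datetime(2024, 7, 1, 12, 0)  # winter date: Adelaide not on DST

perth = when.replace(tzinfo=ZoneInfo("Australia/Perth"))
adelaide = when.replace(tzinfo=ZoneInfo("Australia/Adelaide"))

diff = adelaide.utcoffset() - perth.utcoffset()
hours = diff.total_seconds() / 3600
print(hours)  # 1.5
```

During South Australian daylight saving (roughly October to April) the gap widens to 2 h 30 min, which the single figure above does not capture.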


Distance from Kalgoorlie to Whyalla

There are several ways to calculate the distance from Kalgoorlie to Whyalla. Here are two standard methods:

Vincenty's formula (applied above)
  • 955.160 miles
  • 1537.180 kilometers
  • 830.011 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
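A self-contained sketch of Vincenty's iterative inverse formula on the WGS-84 ellipsoid is shown below. The coordinates are the airport positions listed further down, converted to decimal degrees; the convergence tolerance and iteration cap are implementation choices, not part of the formula itself.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2):
    """Distance in miles between two points via Vincenty's inverse
    formula on the WGS-84 ellipsoid."""
    a = 6378137.0                 # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563         # WGS-84 flattening
    b = (1 - f) * a               # semi-minor axis (m)

    # Reduced latitudes and longitude difference
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate lambda until convergence
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.sqrt((cosU2 * sin_lam) ** 2 +
                              (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                   if cos2_alpha else 0.0)  # equatorial line special case
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma *
                                     (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2) -
        B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) *
        (-3 + 4 * cos_2sm ** 2)))
    meters = b * A * (sigma - delta_sigma)
    return meters / 1609.344  # meters -> statute miles

# KGI and WYA coordinates in decimal degrees (south/east signs applied)
miles = vincenty_miles(-30.789167, 121.461944, -33.058889, 137.513889)
print(round(miles, 1))  # roughly 955, in line with the figure quoted above
```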

Haversine formula
  • 953.323 miles
  • 1534.225 kilometers
  • 828.415 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
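The haversine calculation is short enough to sketch in full. This version assumes a mean Earth radius of 3958.8 miles (6371 km); other radius choices shift the result by a fraction of a percent.

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance in miles, assuming a spherical Earth."""
    R = 3958.8  # assumed mean Earth radius in miles (~6371 km)
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlon = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2 +
         math.cos(phi1) * math.cos(phi2) * math.sin(dlon / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

# KGI and WYA coordinates in decimal degrees
miles = haversine_miles(-30.789167, 121.461944, -33.058889, 137.513889)
print(round(miles, 1))  # close to the 953.3 miles quoted above
```

The spherical result lands about 2 miles short of the ellipsoidal (Vincenty) figure on this route, which is typical of the two models' disagreement at this distance.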

How long does it take to fly from Kalgoorlie to Whyalla?

The estimated flight time from Kalgoorlie-Boulder Airport to Whyalla Airport is 2 hours and 18 minutes.
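The site does not publish its flight-time model, but estimates like this are commonly built from an assumed cruise speed plus a fixed buffer for taxi, climb, and descent. The sketch below uses an assumed 500 mph cruise and a 30-minute buffer; with those inputs it lands near, but not exactly on, the 2 h 18 min figure quoted above.

```python
def estimate_flight_minutes(distance_miles, cruise_mph=500, buffer_min=30):
    """Rough flight-time estimate: cruise leg plus a fixed buffer for
    taxi, climb and descent. Both parameters are assumptions."""
    return buffer_min + distance_miles / cruise_mph * 60

minutes = estimate_flight_minutes(955.16)
print(f"{int(minutes // 60)} h {round(minutes % 60)} min")
```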

Flight carbon footprint between Kalgoorlie-Boulder Airport (KGI) and Whyalla Airport (WYA)

On average, flying from Kalgoorlie to Whyalla generates about 148 kg of CO2 per passenger, equivalent to 326 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
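The unit conversion above follows from the exact definition of the pound (0.45359237 kg). The emission factor in the sketch below is simply derived from the page's own numbers (148 kg over 955.16 miles), not a published figure.

```python
KG_PER_LB = 0.45359237  # exact definition of the avoirdupois pound

def kg_to_lb(kg):
    return kg / KG_PER_LB

# Implied route-specific factor: ~0.155 kg CO2 per passenger-mile
# (148 / 955.16) -- derived here for illustration, not published.
print(round(kg_to_lb(148)))  # 326
```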

Map of flight path and driving directions from Kalgoorlie to Whyalla

See the map of the shortest flight path between Kalgoorlie-Boulder Airport (KGI) and Whyalla Airport (WYA).

Airport information

Origin Kalgoorlie-Boulder Airport
City: Kalgoorlie
Country: Australia
IATA Code: KGI
ICAO Code: YPKG
Coordinates: 30°47′21″S, 121°27′43″E
Destination Whyalla Airport
City: Whyalla
Country: Australia
IATA Code: WYA
ICAO Code: YWHA
Coordinates: 33°3′32″S, 137°30′50″E
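The coordinates above are in degrees/minutes/seconds; the distance formulas earlier need them as signed decimal degrees. A minimal converter:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter
    (N/S/E/W) to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# Kalgoorlie-Boulder Airport: 30 deg 47' 21" S, 121 deg 27' 43" E
print(round(dms_to_decimal(30, 47, 21, "S"), 4))   # -30.7892
print(round(dms_to_decimal(121, 27, 43, "E"), 4))  # 121.4619
```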