
How far is Whyalla from Hervey Bay?

The distance between Hervey Bay (Hervey Bay Airport) and Whyalla (Whyalla Airport) is 1069 miles / 1720 kilometers / 929 nautical miles.

The driving distance from Hervey Bay (HVB) to Whyalla (WYA) is 1391 miles / 2238 kilometers, and travel time by car is about 27 hours 31 minutes.

Hervey Bay Airport – Whyalla Airport

  • Distance: 1069 miles / 1720 kilometers / 929 nautical miles
  • Flight time: 2 h 31 min
  • CO2 emission: 155 kg

Distance from Hervey Bay to Whyalla

There are several ways to calculate the distance from Hervey Bay to Whyalla. Here are two standard methods:

Vincenty's formula (applied above)
  • 1069.008 miles
  • 1720.401 kilometers
  • 928.942 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
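
To reproduce the figure above, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The iterative scheme follows the standard published formulation, and the airport coordinates are the ones listed in the airport information section below; treat it as an illustration rather than the calculator's exact implementation.

    import math

    def vincenty_distance(lat1, lon1, lat2, lon2):
        """Vincenty's inverse formula on the WGS-84 ellipsoid; returns meters."""
        a = 6378137.0              # semi-major axis (m)
        f = 1 / 298.257223563      # flattening
        b = (1 - f) * a            # semi-minor axis (m)

        U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
        U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
        L = math.radians(lon2 - lon1)
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)

        lam = L
        for _ in range(200):       # iterate until lambda converges
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.hypot(cosU2 * sin_lam,
                                   cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
            if sin_sigma == 0:
                return 0.0         # coincident points
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos2_alpha = 1 - sin_alpha ** 2
            # cos2_alpha is 0 only for equatorial lines; avoid dividing by zero
            cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
            C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
            lam_prev = lam
            lam = L + (1 - C) * f * sin_alpha * (
                sigma + C * sin_sigma * (
                    cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
            if abs(lam - lam_prev) < 1e-12:
                break

        u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sm ** 2)
            - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
        return b * A * (sigma - delta_sigma)

    # HVB and WYA in decimal degrees (converted from the DMS coordinates below)
    hvb = (-25.318889, 152.880000)
    wya = (-33.058889, 137.513889)
    print(vincenty_distance(*hvb, *wya) / 1000)   # ≈ 1720.4 km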

Haversine formula
  • 1068.326 miles
  • 1719.303 kilometers
  • 928.350 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
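
A corresponding sketch of the haversine formula, assuming a mean Earth radius of 6371 km (a common convention; other choices of radius shift the result by a few tenths of a percent):

    import math

    def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
        """Great-circle distance on a sphere of the given radius, in km."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        h = (math.sin(dphi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
        return 2 * radius_km * math.asin(math.sqrt(h))

    print(haversine_distance(-25.318889, 152.88, -33.058889, 137.513889))  # ≈ 1719.3 km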

How long does it take to fly from Hervey Bay to Whyalla?

The estimated flight time from Hervey Bay Airport to Whyalla Airport is 2 hours and 31 minutes.
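
The page does not state the model behind this estimate. A common rule of thumb, shown here purely as an assumption, divides the flight distance by a typical airliner cruise speed of about 500 mph and adds a fixed allowance of roughly 30 minutes for takeoff and landing; note that these guessed constants land a few minutes above the quoted 2 h 31 min.

    # Hypothetical flight-time model: distance / cruise speed + fixed overhead.
    # The 500 mph speed and 30 min overhead are assumptions, not the site's
    # published constants, so the result only approximates the quoted figure.
    distance_miles = 1069
    cruise_mph = 500
    overhead_min = 30
    total_min = distance_miles / cruise_mph * 60 + overhead_min
    print(f"{int(total_min // 60)} h {round(total_min % 60)} min")  # 2 h 38 min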

Flight carbon footprint between Hervey Bay Airport (HVB) and Whyalla Airport (WYA)

On average, flying from Hervey Bay to Whyalla generates about 155 kg of CO2 per passenger; 155 kilograms is equal to 342 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
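
The unit conversion is simple arithmetic (1 kg ≈ 2.20462 lb), and dividing the per-passenger total by the flight distance gives an implied emission rate; the per-mile rate below is derived from the page's numbers, not stated by it.

    LB_PER_KG = 2.20462                 # pounds per kilogram
    co2_kg = 155
    distance_miles = 1069
    print(round(co2_kg * LB_PER_KG))          # 342 lb, matching the text
    print(round(co2_kg / distance_miles, 3))  # ≈ 0.145 kg CO2 per mile (derived)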

Map of flight path and driving directions from Hervey Bay to Whyalla

See the map of the shortest flight path between Hervey Bay Airport (HVB) and Whyalla Airport (WYA).

Airport information

Origin Hervey Bay Airport
City: Hervey Bay
Country: Australia
IATA Code: HVB
ICAO Code: YHBA
Coordinates: 25°19′8″S, 152°52′48″E
Destination Whyalla Airport
City: Whyalla
Country: Australia
IATA Code: WYA
ICAO Code: YWHA
Coordinates: 33°3′32″S, 137°30′50″E
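
The coordinates above are given in degrees, minutes, and seconds. A small helper (a convenience sketch, not part of the original page) converts them to the signed decimal degrees used by the distance formulas, with south and west taken as negative:

    def dms_to_decimal(deg, minutes, seconds, hemisphere):
        """Convert degrees/minutes/seconds plus a hemisphere letter
        to signed decimal degrees (negative for S and W)."""
        value = deg + minutes / 60 + seconds / 3600
        return -value if hemisphere in ("S", "W") else value

    print(dms_to_decimal(25, 19, 8, "S"))    # ≈ -25.318889 (HVB latitude)
    print(dms_to_decimal(152, 52, 48, "E"))  # 152.88       (HVB longitude)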