
How far is Whyalla from Launceston?

The distance between Launceston (Launceston Airport) and Whyalla (Whyalla Airport) is 791 miles / 1274 kilometers / 688 nautical miles.

The driving distance from Launceston (LST) to Whyalla (WYA) is 1029 miles / 1656 kilometers, and travel time by car is about 24 hours 9 minutes.

Launceston Airport – Whyalla Airport

Distance:
  • 791 miles
  • 1274 kilometers
  • 688 nautical miles

Flight time: 1 h 59 min
CO2 emission: 134 kg


Distance from Launceston to Whyalla

There are several ways to calculate the distance from Launceston to Whyalla. Here are two standard methods:

Vincenty's formula (applied above)
  • 791.437 miles
  • 1273.694 kilometers
  • 687.740 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
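As an illustration, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid, applied to the two airport coordinates listed at the bottom of this page. This is a generic textbook implementation, not the calculator's own code, so the result may differ from the figure above by small rounding amounts.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Geodesic distance in kilometers on the WGS-84 ellipsoid
    (Vincenty's inverse formula)."""
    a = 6378137.0                 # semi-major axis (m)
    f = 1 / 298.257223563         # flattening
    b = (1 - f) * a               # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):          # iterate until lambda converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0            # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
        - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
          * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0   # metres -> km

# Airport coordinates from the table below, in decimal degrees
lst = (-41.5453, 147.2139)   # Launceston (LST)
wya = (-33.0589, 137.5139)   # Whyalla (WYA)
print(round(vincenty_distance(*lst, *wya), 3), "km")
```

Because the ellipsoid model accounts for the Earth's flattening, this geodesic distance is slightly longer here than the spherical (haversine) result below it.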

Haversine formula
  • 791.422 miles
  • 1273.670 kilometers
  • 687.727 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
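The haversine calculation is short enough to show in full. The sketch below assumes a mean Earth radius of 6371 km (a common convention, not necessarily the one this calculator uses), so the result may differ from the figure above by a few hundred meters.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km on a sphere of mean radius 6371 km."""
    R = 6371.0
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(h))

# Launceston (LST) to Whyalla (WYA), coordinates from the table below
km = haversine_km(-41.5453, 147.2139, -33.0589, 137.5139)
print(round(km, 3), "km /", round(km / 1.609344, 3), "miles")
```

Using `sin²` of the half-angle differences keeps the formula numerically stable for short distances, where the naive spherical law of cosines loses precision.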

How long does it take to fly from Launceston to Whyalla?

The estimated flight time from Launceston Airport to Whyalla Airport is 1 hour and 59 minutes.
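Flight-time estimates like this are typically a rule of thumb: cruise the great-circle distance at an assumed average speed, plus a fixed allowance for taxi, climb, and descent. The 500 mph and 30 min figures below are illustrative assumptions, not this calculator's published parameters, so the sketch lands close to, but not exactly on, the 1 h 59 min shown above.

```python
CRUISE_MPH = 500      # assumed average cruise speed
OVERHEAD_MIN = 30     # assumed taxi + climb + descent allowance

def flight_minutes(distance_miles):
    """Rough door-to-door flight time estimate in minutes."""
    return OVERHEAD_MIN + distance_miles / CRUISE_MPH * 60

total = flight_minutes(791)
print(f"{int(total // 60)} h {int(total % 60)} min")
```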

Flight carbon footprint between Launceston Airport (LST) and Whyalla Airport (WYA)

On average, flying from Launceston to Whyalla generates about 134 kg of CO2 per passenger, which is equivalent to roughly 295 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
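A per-passenger estimate like this is usually distance multiplied by an emission factor. The factor below is simply back-calculated from the two figures on this page (134 kg over 791 miles, about 0.169 kg per passenger-mile); it is an illustrative assumption, not the calculator's published methodology.

```python
KG_CO2_PER_PASSENGER_MILE = 134 / 791   # back-calculated from this page

def co2_kg(distance_miles):
    """Estimated jet-fuel CO2 per passenger, in kilograms."""
    return distance_miles * KG_CO2_PER_PASSENGER_MILE

kg = co2_kg(791)
lbs = kg * 2.20462   # kilograms -> pounds
print(round(kg), "kg ~", round(lbs), "lbs")
```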

Map of flight path and driving directions from Launceston to Whyalla

See the map of the shortest flight path between Launceston Airport (LST) and Whyalla Airport (WYA).

Airport information

Origin Launceston Airport
City: Launceston
Country: Australia
IATA Code: LST
ICAO Code: YMLT
Coordinates: 41°32′43″S, 147°12′50″E
Destination Whyalla Airport
City: Whyalla
Country: Australia
IATA Code: WYA
ICAO Code: YWHA
Coordinates: 33°3′32″S, 137°30′50″E