How far is Port Augusta from Burnie?

The distance between Burnie (Burnie Airport) and Port Augusta (Port Augusta Airport) is 735 miles / 1182 kilometers / 638 nautical miles.

The driving distance from Burnie (BWT) to Port Augusta (PUG) is 959 miles / 1543 kilometers, and travel time by car is about 22 hours 55 minutes.

Burnie Airport – Port Augusta Airport

Distance: 735 miles / 1182 kilometers / 638 nautical miles
Flight time: 1 h 53 min
CO2 emission: 128 kg

Distance from Burnie to Port Augusta

There are several ways to calculate the distance from Burnie to Port Augusta. Here are two standard methods:

Vincenty's formula (applied above)
  • 734.540 miles
  • 1182.128 kilometers
  • 638.298 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
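As an illustration only (not the calculator's own code), here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid, applied to the airport coordinates listed in the table below. The ellipsoid constants and iteration tolerance are conventional choices, not values taken from this page.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Vincenty's inverse formula on the WGS-84 ellipsoid; returns meters."""
    a = 6378137.0              # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a            # semi-minor axis (m)

    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(phi1))   # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0                         # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha != 0 else 0.0)   # points on the equator
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
              * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)       # geodesic distance in meters

# Burnie (BWT) and Port Augusta (PUG) coordinates from the airport table below
meters = vincenty_distance(-40.9989, 145.7308, -32.5067, 137.7169)
print(f"{meters / 1609.344:.3f} miles, {meters / 1000:.3f} km")
# Should land close to the ~734.5 miles / ~1182.1 km quoted above
```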

Haversine formula
  • 734.854 miles
  • 1182.633 kilometers
  • 638.571 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance, the shortest distance over the earth's surface).
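For comparison, here is a minimal haversine sketch in Python using the same coordinates; the 6371 km mean Earth radius is an assumed value, and the exact result depends on the radius chosen.

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere (haversine formula); returns km."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h))

km = haversine_distance(-40.9989, 145.7308, -32.5067, 137.7169)
print(f"{km:.3f} km, {km / 1.852:.3f} NM")
# Roughly 1182.6 km / 638.6 NM, close to the figures listed above
```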

How long does it take to fly from Burnie to Port Augusta?

The estimated flight time from Burnie Airport to Port Augusta Airport is 1 hour and 53 minutes.
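The page does not state how this estimate is derived. A common rough heuristic is cruise distance divided by an average speed plus a fixed allowance for taxi, climb, and descent; the sketch below uses an assumed 500 mph cruise speed and 30-minute allowance, which are illustrative parameters rather than the calculator's own.

```python
def estimate_flight_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    """Very rough flight-time estimate: cruise time plus a fixed overhead."""
    return distance_miles / cruise_mph * 60 + overhead_min

minutes = estimate_flight_minutes(735)
print(f"{int(minutes // 60)} h {int(minutes % 60)} min")
# About 1 h 58 min, in the same ballpark as the 1 h 53 min quoted above
```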

Flight carbon footprint between Burnie Airport (BWT) and Port Augusta Airport (PUG)

On average, flying from Burnie to Port Augusta generates about 128 kg of CO2 per passenger, which is equivalent to 283 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
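As a rough cross-check (not the calculator's own methodology), a per-passenger figure can be formed from the flight distance times an emission factor; the 0.11 kg CO2 per passenger-kilometer below is a hypothetical short-haul value chosen for illustration.

```python
KG_PER_LB = 0.45359237   # exact kilograms-per-pound conversion factor

def co2_estimate_kg(distance_km, kg_per_pax_km=0.11):
    """Per-passenger CO2 estimate from an assumed emission factor."""
    return distance_km * kg_per_pax_km

kg = co2_estimate_kg(1182)
print(f"{kg:.0f} kg = {kg / KG_PER_LB:.0f} lbs")
# About 130 kg, close to the 128 kg per passenger quoted above
```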

Map of flight path and driving directions from Burnie to Port Augusta

See the map of the shortest flight path between Burnie Airport (BWT) and Port Augusta Airport (PUG).

Airport information

Origin Burnie Airport
City: Burnie
Country: Australia
IATA Code: BWT
ICAO Code: YWYY
Coordinates: 40°59′56″S, 145°43′51″E
Destination Port Augusta Airport
City: Port Augusta
Country: Australia
IATA Code: PUG
ICAO Code: YPAG
Coordinates: 32°30′24″S, 137°43′1″E