
How far is Port Augusta from King Island, Tasmania?

The distance between King Island, Tasmania (King Island Airport) and Port Augusta (Port Augusta Airport) is 613 miles / 987 kilometers / 533 nautical miles.

King Island Airport – Port Augusta Airport

Distance: 613 miles / 987 kilometers / 533 nautical miles
Flight time: 1 h 39 min
CO2 emission: 114 kg


Distance from King Island, Tasmania to Port Augusta

There are several ways to calculate the distance from King Island, Tasmania to Port Augusta. Here are two standard methods:

Vincenty's formula (applied above)
  • 613.483 miles
  • 987.306 kilometers
  • 533.103 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points using an ellipsoidal model of the Earth, which matches the planet's true shape more closely than a sphere does.
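Vincenty's inverse method is iterative. The sketch below is a standard textbook transcription over the WGS-84 ellipsoid, not the site's own code; constants and tolerance are the usual published values.

```python
import math

def vincenty_m(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula: geodesic distance in metres on WGS-84."""
    a = 6378137.0                 # semi-major axis (m)
    f = 1 / 298.257223563         # flattening
    b = (1 - f) * a               # semi-minor axis (m)

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):          # iterate until lambda converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.sqrt((cosU2 * sin_lam) ** 2 +
                              (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        if sin_sigma == 0:
            return 0.0            # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                   if cos2_alpha else 0.0)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma *
                                     (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2) -
        B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) *
        (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma)
```

Feeding in the two airports' coordinates (KNS at about 39.877°S, 143.878°E; PUG at about 32.507°S, 137.717°E) should land close to the 987.3 km quoted above.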

Haversine formula
  • 613.926 miles
  • 988.018 kilometers
  • 533.487 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical Earth, yielding the great-circle distance: the shortest path between two points on the sphere's surface.
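The haversine formula is much simpler and needs no iteration. A minimal sketch, assuming the commonly used mean Earth radius of 6,371 km (the site's exact radius is not stated):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in kilometres on a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    # haversine of the central angle between the two points
    h = (math.sin(dphi / 2) ** 2 +
         math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h))
```

For the KNS–PUG pair this comes out near the 988.0 km figure above, a few hundred metres off the ellipsoidal (Vincenty) result.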

How long does it take to fly from King Island, Tasmania to Port Augusta?

The estimated flight time from King Island Airport to Port Augusta Airport is 1 hour and 39 minutes.
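Flight-time estimates of this kind typically add a fixed overhead for taxi, climb, and descent to the cruise time at an assumed average speed. The sketch below uses illustrative constants (450 mph cruise, 18 minutes overhead), not the site's actual model:

```python
def estimate_flight_minutes(distance_miles, cruise_mph=450, overhead_min=18):
    """Rough block time: fixed taxi/climb/descent overhead plus cruise
    time at an assumed average ground speed. Both defaults are
    illustrative assumptions, not the site's published model."""
    return overhead_min + distance_miles / cruise_mph * 60
```

With these assumptions the 613-mile KNS–PUG leg lands near the quoted 1 hour 39 minutes.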

Flight carbon footprint between King Island Airport (KNS) and Port Augusta Airport (PUG)

On average, flying from King Island, Tasmania to Port Augusta generates about 114 kg (252 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
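The kilogram-to-pound conversion uses the exact definition 1 lb = 0.45359237 kg; note that 114 kg works out to about 251.3 lb, which the page rounds to 252:

```python
KG_PER_LB = 0.45359237  # exact by international definition

def kg_to_lb(kg):
    """Convert a mass in kilograms to pounds."""
    return kg / KG_PER_LB
```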

Map of flight path from King Island, Tasmania to Port Augusta

See the map of the shortest flight path between King Island Airport (KNS) and Port Augusta Airport (PUG).

Airport information

Origin: King Island Airport
City: King Island, Tasmania
Country: Australia
IATA Code: KNS
ICAO Code: YKII
Coordinates: 39°52′38″S, 143°52′40″E

Destination: Port Augusta Airport
City: Port Augusta
Country: Australia
IATA Code: PUG
ICAO Code: YPAG
Coordinates: 32°30′24″S, 137°43′1″E
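The coordinates above are in degrees/minutes/seconds; distance formulas need signed decimal degrees (south and west negative). A small conversion helper:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter
    (N/S/E/W) to signed decimal degrees; S and W are negative."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# King Island Airport (KNS): 39°52′38″S, 143°52′40″E
kns = (dms_to_decimal(39, 52, 38, "S"), dms_to_decimal(143, 52, 40, "E"))
# Port Augusta Airport (PUG): 32°30′24″S, 137°43′1″E
pug = (dms_to_decimal(32, 30, 24, "S"), dms_to_decimal(137, 43, 1, "E"))
```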