How far is Grayling, AK, from King Salmon, AK?

The distance between King Salmon (King Salmon Airport) and Grayling (Grayling Airport) is 314 miles / 505 kilometers / 273 nautical miles.

Distance from King Salmon to Grayling

There are several ways to calculate the distance from King Salmon to Grayling. Here are two standard methods:

Vincenty's formula (applied above)
  • 314.028 miles
  • 505.379 kilometers
  • 272.883 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
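
For readers who want to reproduce the figure, here is a minimal sketch using the third-party geopy library. Note that geopy computes its geodesic distance with Karney's algorithm rather than Vincenty's, but both operate on the WGS-84 ellipsoid and agree to well under a meter over a route of this length. The coordinates are the decimal-degree equivalents of the DMS values listed under Airport information below.

```python
# Sketch using the third-party geopy library (pip install geopy).
from geopy.distance import geodesic, great_circle

# Decimal-degree equivalents of the DMS coordinates listed under
# "Airport information" below.
akn = (58.6767, -156.6489)   # King Salmon Airport (AKN)
kgx = (62.8950, -160.0661)   # Grayling Airport (KGX)

print(geodesic(akn, kgx).miles)      # ellipsoidal (WGS-84) distance, ~314.0 mi
print(great_circle(akn, kgx).miles)  # spherical distance, ~313.3 mi
```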

Haversine formula
  • 313.312 miles
  • 504.226 kilometers
  • 272.260 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
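
The haversine calculation itself is short enough to write out directly. A self-contained sketch, assuming the mean Earth radius of 3,958.8 miles:

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, assuming a spherical Earth."""
    R = 3958.8  # mean Earth radius in miles
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

akn = (58.6767, -156.6489)  # King Salmon Airport (AKN)
kgx = (62.8950, -160.0661)  # Grayling Airport (KGX)
print(round(haversine_miles(*akn, *kgx), 3))  # ~313.3 miles, matching the figure above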

How long does it take to fly from King Salmon to Grayling?

The estimated flight time from King Salmon Airport to Grayling Airport is 1 hour and 5 minutes.
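
The calculator's exact method is not published, but a common rule of thumb adds a fixed allowance for taxi, climb, and descent to cruise time at a typical speed. A sketch of that heuristic, where the 500 mph cruise speed and 30-minute overhead are assumptions rather than the calculator's actual parameters:

```python
def estimated_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough flight-time heuristic: fixed taxi/climb/descent overhead
    plus cruise time. The defaults are assumptions, not the
    calculator's actual parameters."""
    total_min = overhead_min + distance_miles / cruise_mph * 60
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} h {minutes} min"

print(estimated_flight_time(314))  # ~1 h 8 min, close to the 1 h 5 min quoted above
```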

What is the time difference between King Salmon and Grayling?

There is no time difference between King Salmon and Grayling.

Flight carbon footprint between King Salmon Airport (AKN) and Grayling Airport (KGX)

On average, flying from King Salmon to Grayling generates about 71 kg (157 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
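
The unit conversion behind that figure, plus the per-mile intensity it implies (a derived figure, not one the calculator publishes):

```python
KG_TO_LBS = 2.20462  # pounds per kilogram

co2_kg = 71  # per-passenger estimate quoted above
print(f"{co2_kg} kg = {co2_kg * KG_TO_LBS:.0f} lbs")  # 71 kg = 157 lbs

# Implied emission intensity for this 314-mile route:
print(f"{co2_kg / 314:.3f} kg CO2 per mile")          # ~0.226 kg/mi
```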

Map of flight path from King Salmon to Grayling

See the map of the shortest flight path between King Salmon Airport (AKN) and Grayling Airport (KGX).

Airport information

Origin: King Salmon Airport
City: King Salmon, AK
Country: United States
IATA Code: AKN
ICAO Code: PAKN
Coordinates: 58°40′36″N, 156°38′56″W
Destination: Grayling Airport
City: Grayling, AK
Country: United States
IATA Code: KGX
ICAO Code: PAGX
Coordinates: 62°53′42″N, 160°3′58″W
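
The distance sketches above use decimal degrees. A small helper for converting the DMS coordinates listed here:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# King Salmon Airport (AKN): 58°40′36″N, 156°38′56″W
print(dms_to_decimal(58, 40, 36, "N"))   # ~58.6767
print(dms_to_decimal(156, 38, 56, "W"))  # ~-156.6489

# Grayling Airport (KGX): 62°53′42″N, 160°3′58″W
print(dms_to_decimal(62, 53, 42, "N"))   # ~62.8950
print(dms_to_decimal(160, 3, 58, "W"))   # ~-160.0661
```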