
How far is Cranbrook from Akutan, AK?

The distance between Akutan (Akutan Seaplane Base) and Cranbrook (Cranbrook/Canadian Rockies International Airport) is 2117 miles / 3407 kilometers / 1840 nautical miles.

Akutan Seaplane Base – Cranbrook/Canadian Rockies International Airport

2117 miles · 3407 kilometers · 1840 nautical miles


Distance from Akutan to Cranbrook

There are several ways to calculate the distance from Akutan to Cranbrook. Here are two standard methods:

Vincenty's formula (applied above)
  • 2116.898 miles
  • 3406.817 kilometers
  • 1839.534 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
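For readers who want to reproduce the figure above, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The coordinates in the usage example are converted from the DMS values listed in the airport information below; the calculator's own implementation isn't published, so treat this as an illustration rather than its actual code.

    import math

    def vincenty_distance(lat1, lon1, lat2, lon2):
        """Inverse Vincenty formula on the WGS-84 ellipsoid; returns metres."""
        a = 6378137.0              # semi-major axis (m)
        f = 1 / 298.257223563      # flattening
        b = (1 - f) * a            # semi-minor axis (m)

        U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
        U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
        L = math.radians(lon2 - lon1)
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)

        lam = L
        for _ in range(200):                      # iterate lambda until it converges
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.sqrt((cosU2 * sin_lam) ** 2 +
                                  (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
            if sin_sigma == 0:
                return 0.0                        # coincident points
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos2_alpha = 1 - sin_alpha ** 2
            cos_2sig_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
            C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
            lam_prev = lam
            lam = L + (1 - C) * f * sin_alpha * (
                sigma + C * sin_sigma * (cos_2sig_m + C * cos_sigma *
                                         (-1 + 2 * cos_2sig_m ** 2)))
            if abs(lam - lam_prev) < 1e-12:
                break

        u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        delta_sigma = B * sin_sigma * (cos_2sig_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sig_m ** 2) -
            B / 6 * cos_2sig_m * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sig_m ** 2)))
        return b * A * (sigma - delta_sigma)

    # Akutan (KQA) and Cranbrook (YXC), decimal degrees from the DMS coordinates below
    kqa = (54.1322, -165.7850)
    yxc = (49.6106, -115.7819)
    metres = vincenty_distance(*kqa, *yxc)
    print(f"{metres / 1609.344:.1f} mi / {metres / 1000:.1f} km")  # should land near 2116.9 mi / 3406.8 km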

Haversine formula
  • 2110.275 miles
  • 3396.158 kilometers
  • 1833.778 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
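The haversine calculation is much shorter. The sketch below assumes a mean Earth radius of 6371 km; choosing a slightly different radius shifts the result by a few kilometers, so it may not match the figure above digit for digit.

    import math

    def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
        """Great-circle distance on a sphere of the given mean radius (km)."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
        return 2 * radius_km * math.asin(math.sqrt(a))

    km = haversine_distance(54.1322, -165.7850, 49.6106, -115.7819)
    print(f"{km:.1f} km / {km / 1.609344:.1f} mi")  # roughly 3396 km, close to the figure above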

How long does it take to fly from Akutan to Cranbrook?

The estimated flight time from Akutan Seaplane Base to Cranbrook/Canadian Rockies International Airport is 4 hours and 30 minutes.
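The page does not state the assumptions behind this figure. A common rule of thumb is cruise time at roughly 500 mph plus a fixed allowance for taxi, climb, and descent; the sketch below uses those assumed parameters, which is why its result is close to, but not exactly, the 4 hours 30 minutes quoted above.

    def estimate_flight_time(distance_miles, cruise_mph=500, overhead_minutes=30):
        """Block time = cruise time at an assumed speed plus a fixed taxi/climb/descent allowance."""
        total_minutes = overhead_minutes + distance_miles / cruise_mph * 60
        hours, minutes = divmod(round(total_minutes), 60)
        return hours, minutes

    print(estimate_flight_time(2117))  # (4, 44) with these assumed parameters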

Flight carbon footprint between Akutan Seaplane Base (KQA) and Cranbrook/Canadian Rockies International Airport (YXC)

On average, flying from Akutan to Cranbrook generates about 231 kg of CO2 per passenger, which is roughly 509 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
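The exact methodology isn't given either. The sketch below simply back-derives a per-passenger, per-mile factor from the 231 kg and 2117 mile figures above and converts kilograms to pounds; it is not an official emission factor.

    KG_CO2_PER_PASSENGER_MILE = 0.109   # back-derived from 231 kg over 2117 miles; not an official factor
    LBS_PER_KG = 2.20462

    def co2_per_passenger(distance_miles, factor=KG_CO2_PER_PASSENGER_MILE):
        kg = distance_miles * factor
        return kg, kg * LBS_PER_KG

    kg, lbs = co2_per_passenger(2117)
    print(f"{kg:.0f} kg CO2 ~ {lbs:.0f} lbs")   # about 231 kg, about 509 lbs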

Map of flight path from Akutan to Cranbrook

See the map of the shortest flight path between Akutan Seaplane Base (KQA) and Cranbrook/Canadian Rockies International Airport (YXC).

Airport information

Origin: Akutan Seaplane Base
City: Akutan, AK
Country: United States
IATA Code: KQA
ICAO Code: KQA
Coordinates: 54°7′56″N, 165°47′6″W
Destination: Cranbrook/Canadian Rockies International Airport
City: Cranbrook
Country: Canada
IATA Code: YXC
ICAO Code: CYXC
Coordinates: 49°36′38″N, 115°46′55″W
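
To feed these coordinates into the formulas above, they first need to be converted from degrees-minutes-seconds to decimal degrees. A small helper, assuming the exact formatting used on this page:

    import re

    def dms_to_decimal(dms):
        """Convert a coordinate such as 54°7′56″N or 115°46′55″W to decimal degrees."""
        deg, minutes, seconds, hemisphere = re.match(r"(\d+)°(\d+)′(\d+)″([NSEW])", dms).groups()
        value = int(deg) + int(minutes) / 60 + int(seconds) / 3600
        return -value if hemisphere in "SW" else value

    print(dms_to_decimal("54°7′56″N"), dms_to_decimal("165°47′6″W"))    # about 54.1322 -165.7850
    print(dms_to_decimal("49°36′38″N"), dms_to_decimal("115°46′55″W"))  # about 49.6106 -115.7819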