
How far is Arctic Bay from Whitehorse?

The distance between Whitehorse (Erik Nielsen Whitehorse International Airport) and Arctic Bay (Arctic Bay Airport) is 1539 miles / 2476 kilometers / 1337 nautical miles.

Erik Nielsen Whitehorse International Airport – Arctic Bay Airport

  • 1539 miles
  • 2476 kilometers
  • 1337 nautical miles


Distance from Whitehorse to Arctic Bay

There are several ways to calculate the distance from Whitehorse to Arctic Bay. Here are two standard methods:

Vincenty's formula (applied above)
  • 1538.744 miles
  • 2476.369 kilometers
  • 1337.132 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
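As a rough illustration of an ellipsoidal-model calculation, the sketch below uses geopy's geodesic distance (Karney's method on the WGS-84 ellipsoid). This is not the exact Vincenty implementation used for the figures above, but it also treats the Earth as an ellipsoid and should land very close to the ~1538.7 mi / 2476.4 km values. The decimal coordinates are converted from the DMS values in the airport information section.

```python
# Sketch: ellipsoidal (WGS-84) distance with geopy; similar in spirit to Vincenty,
# but geopy uses Karney's geodesic algorithm rather than Vincenty's formula itself.
from geopy.distance import geodesic

yxy = (60.7094, -135.0669)  # Whitehorse (YXY), decimal degrees
yab = (73.0056, -85.0425)   # Arctic Bay (YAB), decimal degrees

d = geodesic(yxy, yab)
print(f"{d.miles:.1f} mi, {d.km:.1f} km, {d.nautical:.1f} NM")
# Expected to be close to 1538.7 mi / 2476.4 km / 1337.1 NM
```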

Haversine formula
  • 1533.204 miles
  • 2467.452 kilometers
  • 1332.318 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
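A minimal haversine implementation, assuming a mean Earth radius of 6371 km, reproduces the spherical-model figures above:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points on a spherical Earth."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
    return 2 * radius_km * asin(sqrt(a))

# YXY and YAB coordinates in decimal degrees (converted from the DMS values below)
d_km = haversine_km(60.7094, -135.0669, 73.0056, -85.0425)
print(f"{d_km:.0f} km, {d_km * 0.621371:.0f} mi, {d_km * 0.539957:.0f} NM")
# ≈ 2467 km / 1533 mi / 1332 NM, matching the haversine figures above
```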

How long does it take to fly from Whitehorse to Arctic Bay?

The estimated flight time from Erik Nielsen Whitehorse International Airport to Arctic Bay Airport is 3 hours and 24 minutes.
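Estimates like this are typically the distance divided by an assumed average block speed. The speed used below (about 450 mph) is an assumption chosen for illustration, not the calculator's published method:

```python
# Rough flight-time estimate: distance / assumed average block speed.
# The 450 mph figure is an assumption, not a published parameter of this calculator.
distance_miles = 1539
avg_block_speed_mph = 450  # assumed average speed including climb and descent

hours = distance_miles / avg_block_speed_mph
h = int(hours)
m = round((hours - h) * 60)
print(f"Estimated flight time: {h} h {m} min")  # ≈ 3 h 25 min, close to the 3 h 24 min above
```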

Flight carbon footprint between Erik Nielsen Whitehorse International Airport (YXY) and Arctic Bay Airport (YAB)

On average, flying from Whitehorse to Arctic Bay generates about 182 kg of CO2 per passenger; 182 kilograms is equal to 401 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
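The kilograms-to-pounds conversion can be checked directly (1 kg ≈ 2.20462 lb):

```python
# Unit-conversion check for the CO2 estimate: kilograms to pounds.
co2_kg = 182
KG_TO_LB = 2.20462  # pounds per kilogram
print(f"{co2_kg} kg ≈ {co2_kg * KG_TO_LB:.0f} lbs")  # ≈ 401 lbs
```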

Map of flight path from Whitehorse to Arctic Bay

See the map of the shortest flight path between Erik Nielsen Whitehorse International Airport (YXY) and Arctic Bay Airport (YAB).

Airport information

Origin: Erik Nielsen Whitehorse International Airport
City: Whitehorse
Country: Canada
IATA Code: YXY
ICAO Code: CYXY
Coordinates: 60°42′34″N, 135°4′1″W
Destination: Arctic Bay Airport
City: Arctic Bay
Country: Canada
IATA Code: YAB
ICAO Code: CYAB
Coordinates: 73°0′20″N, 85°2′33″W
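The coordinates above are given in degrees, minutes, and seconds; the distance formulas earlier expect signed decimal degrees. A small helper, written for illustration, performs that conversion:

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere (N/S/E/W) to signed decimal degrees."""
    value = deg + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

# Airport coordinates from the table above
whitehorse = (dms_to_decimal(60, 42, 34, "N"), dms_to_decimal(135, 4, 1, "W"))
arctic_bay = (dms_to_decimal(73, 0, 20, "N"), dms_to_decimal(85, 2, 33, "W"))
print(whitehorse)  # ≈ (60.7094, -135.0669)
print(arctic_bay)  # ≈ (73.0056, -85.0425)
```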