
How far is Whitehorse from Grise Fiord?

The distance between Grise Fiord (Grise Fiord Airport) and Whitehorse (Erik Nielsen Whitehorse International Airport) is 1618 miles / 2604 kilometers / 1406 nautical miles.

Grise Fiord Airport – Erik Nielsen Whitehorse International Airport

  • 1618 miles
  • 2604 kilometers
  • 1406 nautical miles


Distance from Grise Fiord to Whitehorse

There are several ways to calculate the distance from Grise Fiord to Whitehorse. Here are two standard methods:

Vincenty's formula (applied above)
  • 1617.773 miles
  • 2603.553 kilometers
  • 1405.806 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
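As a rough illustration (not the calculator's own code), an ellipsoidal distance very close to the Vincenty result can be reproduced with the geopy library, whose geodesic routine works on the WGS-84 ellipsoid and agrees with Vincenty's formula to well under a metre on this route. The decimal-degree coordinates are converted from the airport coordinates listed at the bottom of this page.

```python
# Ellipsoidal (WGS-84) distance between YGZ and YXY using geopy.
# Illustrative sketch only; not the calculator's own implementation.
from geopy.distance import geodesic

ygz = (76.4258, -82.9092)    # Grise Fiord Airport, decimal degrees
yxy = (60.7094, -135.0669)   # Erik Nielsen Whitehorse International Airport

d = geodesic(ygz, yxy)
print(f"{d.miles:.3f} miles")         # ~1617.8 mi
print(f"{d.km:.3f} kilometers")       # ~2603.6 km
print(f"{d.nm:.3f} nautical miles")   # ~1405.8 NM
```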

Haversine formula
  • 1612.009 miles
  • 2594.276 kilometers
  • 1400.797 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
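A minimal haversine sketch, assuming a spherical Earth with a mean radius of 6371 km, reproduces the figures above to within rounding:

```python
# Great-circle (haversine) distance between YGZ and YXY on a sphere
# of mean radius 6371 km. Illustrative sketch only.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Return the great-circle distance in kilometers."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
    return 2 * radius_km * asin(sqrt(a))

km = haversine_km(76.4258, -82.9092, 60.7094, -135.0669)
print(f"{km:.3f} km")                # ~2594.3 km
print(f"{km / 1.609344:.3f} mi")     # ~1612.0 mi
print(f"{km / 1.852:.3f} NM")        # ~1400.8 NM
```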

How long does it take to fly from Grise Fiord to Whitehorse?

The estimated flight time from Grise Fiord Airport to Erik Nielsen Whitehorse International Airport is 3 hours and 33 minutes.
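The exact assumptions behind this estimate are not stated. A simple sketch that divides the route distance by an assumed average block speed of about 455 mph roughly reproduces the 3 hours and 33 minutes figure:

```python
# Rough flight-time estimate from distance. The average block speed of
# ~455 mph is an assumption chosen to roughly match the figure above,
# not the calculator's published method.
def flight_time(distance_miles, avg_speed_mph=455.0):
    hours = distance_miles / avg_speed_mph
    h = int(hours)
    m = round((hours - h) * 60)
    return h, m

h, m = flight_time(1617.773)
print(f"Estimated flight time: {h} hours and {m} minutes")  # ~3 hours and 33 minutes
```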

Flight carbon footprint between Grise Fiord Airport (YGZ) and Erik Nielsen Whitehorse International Airport (YXY)

On average, flying from Grise Fiord to Whitehorse generates about 187 kg (412 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
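As a quick unit check (the 187 kg estimate itself comes from this page; the emission model behind it is not specified here), the kilogram-to-pound conversion and the implied per-mile figure are:

```python
# Converting the per-passenger CO2 estimate from kilograms to pounds,
# and deriving the implied emissions per passenger-mile for this route.
KG_TO_LB = 2.20462

co2_kg = 187
print(f"{co2_kg} kg ~= {co2_kg * KG_TO_LB:.0f} lb")            # ~= 412 lb
print(f"~= {co2_kg / 1618:.3f} kg CO2 per passenger-mile")     # ~= 0.116
```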

Map of flight path from Grise Fiord to Whitehorse

See the map of the shortest flight path between Grise Fiord Airport (YGZ) and Erik Nielsen Whitehorse International Airport (YXY).

Airport information

Origin: Grise Fiord Airport
City: Grise Fiord
Country: Canada
IATA Code: YGZ
ICAO Code: CYGZ
Coordinates: 76°25′33″N, 82°54′33″W
Destination: Erik Nielsen Whitehorse International Airport
City: Whitehorse
Country: Canada
IATA Code: YXY
ICAO Code: CYXY
Coordinates: 60°42′34″N, 135°4′1″W
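The distance sketches earlier on this page use decimal degrees. Converting the degree/minute/second coordinates listed above is a simple calculation:

```python
# Converting the DMS coordinates listed above into the decimal degrees
# used in the distance examples earlier on this page.
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# Grise Fiord Airport (YGZ): 76°25′33″N, 82°54′33″W
print(dms_to_decimal(76, 25, 33, "N"), dms_to_decimal(82, 54, 33, "W"))
# -> 76.4258..., -82.9091...

# Erik Nielsen Whitehorse International Airport (YXY): 60°42′34″N, 135°4′1″W
print(dms_to_decimal(60, 42, 34, "N"), dms_to_decimal(135, 4, 1, "W"))
# -> 60.7094..., -135.0669...
```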