
How far is Aasiaat from Whale Cove?

The distance between Whale Cove (Whale Cove Airport) and Aasiaat (Aasiaat Airport) is 1204 miles / 1938 kilometers / 1047 nautical miles.

Distance from Whale Cove to Aasiaat

There are several ways to calculate the distance from Whale Cove to Aasiaat. Here are two standard methods:

Vincenty's formula (applied above)
  • 1204.373 miles
  • 1938.251 kilometers
  • 1046.572 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
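
For reference, below is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid (a standard choice; the page does not state its exact parameters, so the last decimals may differ). The coordinates are the decimal form of the airport coordinates listed under "Airport information".

    import math

    # WGS-84 ellipsoid parameters
    A = 6378137.0          # semi-major axis, meters
    F = 1 / 298.257223563  # flattening
    B = (1 - F) * A        # semi-minor axis, meters

    def vincenty_miles(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
        """Iterative Vincenty inverse solution; returns statute miles."""
        U1 = math.atan((1 - F) * math.tan(math.radians(lat1)))  # reduced latitudes
        U2 = math.atan((1 - F) * math.tan(math.radians(lat2)))
        L = math.radians(lon2 - lon1)
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)

        lam = L
        for _ in range(max_iter):
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.hypot(cosU2 * sin_lam,
                                   cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
            if sin_sigma == 0:
                return 0.0  # coincident points
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos2_alpha = 1 - sin_alpha ** 2
            # cos(2*sigma_m); defined as 0 on the equatorial line (cos2_alpha == 0)
            cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
            C = F / 16 * cos2_alpha * (4 + F * (4 - 3 * cos2_alpha))
            lam_prev = lam
            lam = L + (1 - C) * F * sin_alpha * (
                sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
            if abs(lam - lam_prev) < tol:
                break  # converged

        u2 = cos2_alpha * (A ** 2 - B ** 2) / B ** 2
        big_a = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        big_b = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        d_sigma = big_b * sin_sigma * (cos_2sm + big_b / 4 * (
            cos_sigma * (-1 + 2 * cos_2sm ** 2)
            - big_b / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
        return B * big_a * (sigma - d_sigma) / 1609.344  # meters -> miles

    # Whale Cove (YXN) to Aasiaat (JEG)
    print(round(vincenty_miles(62.24, -92.59806, 68.72167, -52.78444), 3))  # ≈ 1204.4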

Haversine formula
  • 1199.893 miles
  • 1931.041 kilometers
  • 1042.679 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth; this is the great-circle distance, the shortest path between two points along the surface of a sphere.
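
For comparison, here is a short sketch of the haversine computation. The 3958.8-mile mean Earth radius is a common choice, but the page does not state which radius it uses, so the trailing digits may differ.

    import math

    EARTH_RADIUS_MILES = 3958.8  # mean Earth radius (assumed; not stated on this page)

    def haversine_miles(lat1, lon1, lat2, lon2):
        """Great-circle distance in miles on a spherical Earth."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        a = (math.sin(dphi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
        return 2 * EARTH_RADIUS_MILES * math.asin(math.sqrt(a))

    print(round(haversine_miles(62.24, -92.59806, 68.72167, -52.78444), 3))  # ≈ 1199.9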

How long does it take to fly from Whale Cove to Aasiaat?

The estimated flight time from Whale Cove Airport to Aasiaat Airport is 2 hours and 46 minutes.
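
The page does not publish the assumptions behind this figure. A common rule of thumb, sketched below, is cruise time at an average speed plus a fixed allowance for taxi, climb, and descent; the 500 mph speed and 30-minute allowance are my assumptions, so the result only approximates the 2 hour 46 minute estimate above.

    def estimated_flight_time(distance_miles, cruise_mph=500, allowance_min=30):
        """Rule-of-thumb flight time: cruise leg plus a fixed taxi/climb/descent allowance."""
        minutes = distance_miles / cruise_mph * 60 + allowance_min
        hours, mins = divmod(round(minutes), 60)
        return f"{hours} h {mins} min"

    print(estimated_flight_time(1204))  # ≈ 2 h 54 min with these assumed parameters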

Flight carbon footprint between Whale Cove Airport (YXN) and Aasiaat Airport (JEG)

On average, flying from Whale Cove to Aasiaat generates about 162 kg (roughly 356 pounds) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
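
As a back-of-the-envelope check, the figure implies an emission factor of roughly 0.134 kg of CO2 per passenger-mile (162 kg over 1204 miles). The factor in the sketch below is that implied round number, not a published parameter; real factors vary with aircraft type and load factor.

    KG_PER_LB = 0.453592
    EMISSION_FACTOR = 0.134  # assumed kg CO2 per passenger-mile, implied by this route

    co2_kg = EMISSION_FACTOR * 1204  # route distance in miles
    print(f"{co2_kg:.0f} kg CO2 per passenger (~{co2_kg / KG_PER_LB:.0f} lbs)")
    # -> 161 kg CO2 per passenger (~356 lbs)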

Map of flight path from Whale Cove to Aasiaat

See the map of the shortest flight path between Whale Cove Airport (YXN) and Aasiaat Airport (JEG).

Airport information

Origin: Whale Cove Airport
City: Whale Cove
Country: Canada
IATA Code: YXN
ICAO Code: CYXN
Coordinates: 62°14′24″N, 92°35′53″W
Destination: Aasiaat Airport
City: Aasiaat
Country: Greenland
IATA Code: JEG
ICAO Code: BGAA
Coordinates: 68°43′18″N, 52°47′4″W
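
The distance formulas above take decimal degrees; the small helper below (its name is mine, not from this page) converts the DMS coordinates listed here.

    def dms_to_decimal(degrees, minutes, seconds, hemisphere):
        """Convert degrees/minutes/seconds plus an N/S/E/W hemisphere to decimal degrees."""
        sign = -1 if hemisphere in ("S", "W") else 1
        return sign * (degrees + minutes / 60 + seconds / 3600)

    print(dms_to_decimal(62, 14, 24, "N"), dms_to_decimal(92, 35, 53, "W"))
    # -> 62.24 -92.59805555555556  (Whale Cove Airport, YXN)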