
How far is Gander from Kangerlussuaq?

The distance between Kangerlussuaq (Kangerlussuaq Airport) and Gander (Gander International Airport) is 1258 miles / 2025 kilometers / 1093 nautical miles.

Kangerlussuaq Airport – Gander International Airport

Distance: 1258 miles / 2025 kilometers / 1093 nautical miles
Flight time: 2 h 52 min
Time difference: 1 h 30 min
CO2 emission: 164 kg
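The three distance figures are the same measurement in different units. As a sketch, the conversions follow from the exact definitions of the statute mile (1.609344 km) and the nautical mile (1.852 km):

```python
# Exact definitions: 1 statute mile = 1.609344 km, 1 nautical mile = 1.852 km.
MILES_TO_KM = 1.609344
KM_TO_NM = 1 / 1.852

miles = 1258.317              # Vincenty distance in statute miles
km = miles * MILES_TO_KM      # ~2025.06 kilometers
nm = km * KM_TO_NM            # ~1093.45 nautical miles

print(f"{km:.2f} km, {nm:.2f} NM")
```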


Distance from Kangerlussuaq to Gander

There are several ways to calculate the distance from Kangerlussuaq to Gander. Here are two standard methods:

Vincenty's formula (applied above)
  • 1258.317 miles
  • 2025.064 kilometers
  • 1093.447 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
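The site's exact implementation isn't shown; a minimal sketch of Vincenty's inverse formula on the WGS-84 ellipsoid, with the two airports' coordinates converted to decimal degrees, looks like this:

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns kilometers."""
    a, f = 6378137.0, 1 / 298.257223563   # WGS-84 semi-major axis, flattening
    b = (1 - f) * a
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):                  # iterate until lambda converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000

# SFJ (67.011944N, 50.711389W) to YQX (48.936667N, 54.568056W)
print(vincenty_km(67.011944, -50.711389, 48.936667, -54.568056))
```

Run against the airport coordinates above, this reproduces the ~2025 km figure.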

Haversine formula
  • 1256.276 miles
  • 2021.781 kilometers
  • 1091.674 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
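As a sketch, the haversine computation with a mean Earth radius of 6371 km (the coordinates are the two airports' positions in decimal degrees) goes:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance on a sphere with mean Earth radius 6371 km."""
    R = 6371.0
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

# SFJ (67°0'43"N, 50°42'41"W) to YQX (48°56'12"N, 54°34'5"W)
print(haversine_km(67.011944, -50.711389, 48.936667, -54.568056))
```

This lands on the ~2022 km figure quoted above; the small gap to the Vincenty result reflects the spherical versus ellipsoidal Earth models.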

How long does it take to fly from Kangerlussuaq to Gander?

The estimated flight time from Kangerlussuaq Airport to Gander International Airport is 2 hours and 52 minutes.
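The site doesn't publish its flight-time model. A common back-of-the-envelope rule, an assumed ~500 mph average ground speed plus ~30 minutes for taxi, climb, and descent, gives an estimate in the same ballpark (both the speed and the overhead are assumptions here, not the site's parameters):

```python
# Hypothetical rule of thumb: distance over an assumed average speed,
# plus a fixed overhead for taxi/climb/descent. Not the site's model.
def estimate_minutes(distance_miles, avg_speed_mph=500, overhead_min=30):
    return distance_miles / avg_speed_mph * 60 + overhead_min

total = estimate_minutes(1258.317)
print(f"{int(total // 60)} h {round(total % 60)} min")  # → 3 h 1 min
```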

Flight carbon footprint between Kangerlussuaq Airport (SFJ) and Gander International Airport (YQX)

On average, flying from Kangerlussuaq to Gander generates about 164 kg of CO2 per passenger, which is equivalent to 362 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
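The kilograms-to-pounds conversion above checks out with the standard factor of about 2.20462 lbs per kg:

```python
KG_TO_LBS = 2.20462  # pounds per kilogram

co2_kg = 164
print(round(co2_kg * KG_TO_LBS))  # → 362
```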

Map of flight path from Kangerlussuaq to Gander

See the map of the shortest flight path between Kangerlussuaq Airport (SFJ) and Gander International Airport (YQX).

Airport information

Origin: Kangerlussuaq Airport
City: Kangerlussuaq
Country: Greenland
IATA Code: SFJ
ICAO Code: BGSF
Coordinates: 67°0′43″N, 50°42′41″W
Destination: Gander International Airport
City: Gander
Country: Canada
IATA Code: YQX
ICAO Code: CYQX
Coordinates: 48°56′12″N, 54°34′5″W
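The coordinates above are in degrees/minutes/seconds; the distance formulas want signed decimal degrees. A small conversion sketch:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds + hemisphere letter to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# SFJ: 67°0'43"N, 50°42'41"W
print(dms_to_decimal(67, 0, 43, "N"), dms_to_decimal(50, 42, 41, "W"))
```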