
How far is Iqaluit from Kangerlussuaq?

The distance between Kangerlussuaq (Kangerlussuaq Airport) and Iqaluit (Iqaluit Airport) is 560 miles / 902 kilometers / 487 nautical miles. The estimated flight time is 1 hour and 33 minutes.

Kangerlussuaq Airport – Iqaluit Airport

560 miles / 902 kilometers / 487 nautical miles

Distance from Kangerlussuaq to Iqaluit

There are several ways to calculate the distance from Kangerlussuaq to Iqaluit. Here are two standard methods:

Vincenty's formula (applied above)
  • 560.187 miles
  • 901.533 kilometers
  • 486.789 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
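The iterative inverse formula can be sketched in Python as follows. This is a minimal implementation assuming the standard WGS-84 ellipsoid parameters (the page does not state which ellipsoid it uses); the coordinates below are the SFJ and YFB positions from the airport information section, converted to decimal degrees.

```python
import math

def vincenty(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula: ellipsoidal distance in metres (WGS-84)."""
    a = 6378137.0                 # semi-major axis (m)
    f = 1 / 298.257223563         # flattening
    b = (1 - f) * a               # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):          # iterate until the longitude difference converges
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.hypot(cosU2 * sinLam,
                              cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        if sinSigma == 0:
            return 0.0            # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = (cosSigma - 2 * sinU1 * sinU2 / cos2Alpha) if cos2Alpha else 0.0
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (
                cos2SigmaM + C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < 1e-12:
            break

    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    deltaSigma = B * sinSigma * (
        cos2SigmaM + B / 4 * (
            cosSigma * (-1 + 2 * cos2SigmaM ** 2)
            - B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2)
                * (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - deltaSigma)

sfj = (67.0119, -50.7114)   # Kangerlussuaq (SFJ), decimal degrees
yfb = (63.7564, -68.5556)   # Iqaluit (YFB)
km = vincenty(sfj[0], sfj[1], yfb[0], yfb[1]) / 1000
# should land close to the 901.533 km figure quoted above
```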

Haversine formula
  • 558.118 miles
  • 898.203 kilometers
  • 484.991 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
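The spherical calculation is much shorter. A minimal Python sketch, assuming a mean Earth radius of 6371 km (the page does not state which radius it uses):

```python
import math

def haversine(lat1, lon1, lat2, lon2, r=6371.0):
    """Great-circle distance in kilometres on a sphere of radius r (km)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    # haversine of the central angle between the two points
    h = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * r * math.asin(math.sqrt(h))

km = haversine(67.0119, -50.7114, 63.7564, -68.5556)  # SFJ -> YFB
# close to the 898.203 km quoted above
```

The ~3 km gap between this result and Vincenty's comes entirely from modelling the Earth as a sphere rather than an ellipsoid.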

How long does it take to fly from Kangerlussuaq to Iqaluit?

The estimated flight time from Kangerlussuaq Airport to Iqaluit Airport is 1 hour and 33 minutes.
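Flight-time figures like this are typically derived from a fixed cruise speed plus a taxi/climb allowance. A hedged sketch below, assuming a ~500 mph cruise and a 30-minute overhead (these are common rule-of-thumb values, not the calculator's published parameters, so the result only roughly approximates the quoted 1 hour 33 minutes):

```python
def estimate_flight_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough block-time estimate: cruise time plus a fixed taxi/climb overhead.

    cruise_mph and overhead_min are assumed rule-of-thumb values.
    """
    return overhead_min + distance_miles / cruise_mph * 60

minutes = estimate_flight_minutes(560)  # about 97 min for this route
```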

What is the time difference between Kangerlussuaq and Iqaluit?

Iqaluit is 2 hours behind Kangerlussuaq.

Flight carbon footprint between Kangerlussuaq Airport (SFJ) and Iqaluit Airport (YFB)

On average, flying from Kangerlussuaq to Iqaluit generates about 107 kg of CO2 per passenger, which is roughly 236 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
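The unit conversion behind that figure is straightforward. A minimal sketch (the per-passenger emissions factor implied by the page's numbers, about 0.19 kg CO2 per mile, is an inference from 107 kg over 560 miles, not a published constant):

```python
KG_PER_LB = 0.45359237  # exact by international definition

def kg_to_lbs(kg):
    """Convert kilograms to pounds."""
    return kg / KG_PER_LB

lbs = round(kg_to_lbs(107))          # 107 kg -> 236 lbs
factor = 107 / 560                   # implied ~0.19 kg CO2 per passenger-mile
```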

Map of flight path from Kangerlussuaq to Iqaluit

Shortest flight path between Kangerlussuaq Airport (SFJ) and Iqaluit Airport (YFB).

Airport information

Origin: Kangerlussuaq Airport
City: Kangerlussuaq
Country: Greenland
IATA Code: SFJ
ICAO Code: BGSF
Coordinates: 67°0′43″N, 50°42′41″W
Destination: Iqaluit Airport
City: Iqaluit
Country: Canada
IATA Code: YFB
ICAO Code: CYFB
Coordinates: 63°45′23″N, 68°33′20″W
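The coordinates above are given in degrees/minutes/seconds, while the distance formulas want decimal degrees. A small conversion sketch (the helper name is my own, not from this page):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# Airport coordinates from the table above, as (latitude, longitude) pairs
sfj = (dms_to_decimal(67, 0, 43, "N"), dms_to_decimal(50, 42, 41, "W"))
yfb = (dms_to_decimal(63, 45, 23, "N"), dms_to_decimal(68, 33, 20, "W"))
```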