
How far is Arctic Bay from Kangiqsualujjuaq?

The distance between Kangiqsualujjuaq (Kangiqsualujjuaq (Georges River) Airport) and Arctic Bay (Arctic Bay Airport) is 1116 miles / 1797 kilometers / 970 nautical miles.

Kangiqsualujjuaq (Georges River) Airport – Arctic Bay Airport

1116 miles / 1797 kilometers / 970 nautical miles


Distance from Kangiqsualujjuaq to Arctic Bay

There are several ways to calculate the distance from Kangiqsualujjuaq to Arctic Bay. Here are two standard methods:

Vincenty's formula (applied above)
  • 1116.480 miles
  • 1796.800 kilometers
  • 970.194 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
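To make the ellipsoidal calculation concrete, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The function name and the convergence tolerance are illustrative choices; fed the airport coordinates listed in the airport information section below, it should reproduce the 1116.480-mile figure above to within rounding.

    import math

    def vincenty_miles(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
        """Inverse Vincenty on the WGS-84 ellipsoid; returns statute miles."""
        a = 6378137.0             # semi-major axis in meters
        f = 1 / 298.257223563     # flattening
        b = (1 - f) * a           # semi-minor axis in meters

        L = math.radians(lon2 - lon1)
        U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
        U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)

        lam = L
        for _ in range(max_iter):
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.hypot(cosU2 * sin_lam,
                                   cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
            if sin_sigma == 0:
                return 0.0                      # coincident points
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos2_alpha = 1 - sin_alpha ** 2
            cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                       if cos2_alpha else 0.0)  # equatorial-line fallback
            C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
            lam_prev = lam
            lam = L + (1 - C) * f * sin_alpha * (
                sigma + C * sin_sigma * (
                    cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
            if abs(lam - lam_prev) < tol:
                break                           # converged

        u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sm ** 2)
            - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sm ** 2)))
        return b * A * (sigma - d_sigma) / 1609.344  # meters to statute miles

    # XGR and YAB coordinates in decimal degrees (see airport information below)
    print(vincenty_miles(58.71139, -65.99278, 73.00556, -85.04250))  # ≈ 1116.5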

Haversine formula
  • 1113.146 miles
  • 1791.434 kilometers
  • 967.297 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
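For comparison, a haversine version fits in a few lines. The Earth radius used below (3958.8 statute miles, the mean radius) is an assumed value; any nearby radius changes the result only slightly.

    import math

    def haversine_miles(lat1, lon1, lat2, lon2):
        """Great-circle distance on a sphere, in statute miles."""
        R = 3958.8  # mean Earth radius in miles (assumed)
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        h = (math.sin(dphi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
        return 2 * R * math.asin(math.sqrt(h))

    print(haversine_miles(58.71139, -65.99278, 73.00556, -85.04250))  # ≈ 1113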

How long does it take to fly from Kangiqsualujjuaq to Arctic Bay?

The estimated flight time from Kangiqsualujjuaq (Georges River) Airport to Arctic Bay Airport is 2 hours and 36 minutes.
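The page does not publish the assumptions behind this estimate. A common rule of thumb, sketched below with illustrative numbers (a 500 mph average speed plus a 30-minute allowance for taxi, climb, and descent), gives a figure in the same range; the calculator evidently uses slightly different parameters.

    def flight_time(distance_miles, cruise_mph=500, overhead_min=30):
        """Rough block time: cruise leg plus a fixed taxi/climb/descent allowance.
        Both parameters are illustrative assumptions, not the site's values."""
        hours = distance_miles / cruise_mph + overhead_min / 60
        return int(hours), round(hours % 1 * 60)

    print(flight_time(1116.48))  # (2, 44) under these assumptions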

Flight carbon footprint between Kangiqsualujjuaq (Georges River) Airport (XGR) and Arctic Bay Airport (YAB)

On average, flying from Kangiqsualujjuaq to Arctic Bay generates about 158 kg of CO2 per passenger; 158 kilograms equals 348 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
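The 158 kg per-passenger figure is the page's own estimate; the pound value is just a unit conversion, as this short sketch shows.

    KG_PER_LB = 0.45359237  # exact definition of the avoirdupois pound

    def kg_to_lb(kg):
        return kg / KG_PER_LB

    print(round(kg_to_lb(158)))  # 348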

Map of flight path from Kangiqsualujjuaq to Arctic Bay

See the map of the shortest flight path between Kangiqsualujjuaq (Georges River) Airport (XGR) and Arctic Bay Airport (YAB).

Airport information

Origin: Kangiqsualujjuaq (Georges River) Airport
City: Kangiqsualujjuaq
Country: Canada
IATA Code: XGR
ICAO Code: CYLU
Coordinates: 58°42′41″N, 65°59′34″W
Destination: Arctic Bay Airport
City: Arctic Bay
Country: Canada
IATA Code: YAB
ICAO Code: CYAB
Coordinates: 73°0′20″N, 85°2′33″W
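The distance formulas above take decimal degrees, while the coordinates here are given in degrees, minutes, and seconds. A small conversion helper (the hemisphere-letter convention is an assumption about the notation used):

    def dms_to_decimal(deg, minutes, seconds, hemi):
        """Convert D°M′S″ plus a hemisphere letter to signed decimal degrees."""
        sign = -1 if hemi in ("S", "W") else 1
        return sign * (deg + minutes / 60 + seconds / 3600)

    # XGR: 58°42′41″N, 65°59′34″W  ->  (58.71139, -65.99278)
    print(dms_to_decimal(58, 42, 41, "N"), dms_to_decimal(65, 59, 34, "W"))
    # YAB: 73°0′20″N, 85°2′33″W   ->  (73.00556, -85.04250)
    print(dms_to_decimal(73, 0, 20, "N"), dms_to_decimal(85, 2, 33, "W"))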