
How far is Kugaaruk from Kangiqsujuaq?

The distance between Kangiqsujuaq (Kangiqsujuaq (Wakeham Bay) Airport) and Kugaaruk (Kugaaruk Airport) is 706 miles / 1136 kilometers / 613 nautical miles.

Kangiqsujuaq (Wakeham Bay) Airport – Kugaaruk Airport

706 miles / 1136 kilometers / 613 nautical miles


Distance from Kangiqsujuaq to Kugaaruk

There are several ways to calculate the distance from Kangiqsujuaq to Kugaaruk. Here are two standard methods:

Vincenty's formula (applied above)
  • 705.809 miles
  • 1135.889 kilometers
  • 613.331 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
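
Vincenty's inverse formula has no closed-form solution; it iterates over an auxiliary longitude difference until the geodesic converges. Below is a minimal Python sketch on the WGS-84 ellipsoid; the function name, tolerance and iteration cap are illustrative, and this is not necessarily the exact implementation used for the figure above, though it should land within rounding of it.

```python
import math

def vincenty_distance_km(lat1, lon1, lat2, lon2):
    """Geodesic distance on the WGS-84 ellipsoid via Vincenty's inverse formula."""
    a = 6378137.0              # WGS-84 semi-major axis (metres)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a            # semi-minor axis

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):       # iterate until lambda converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0         # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha != 0 else 0.0)   # equatorial line
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
                                   * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0   # metres -> kilometres

# YWB -> YBB, coordinates from the airport table below (converted to decimal degrees)
print(vincenty_distance_km(61.5883, -71.9292, 68.5342, -89.8081))  # ~1136 km
```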

Haversine formula
  • 703.475 miles
  • 1132.133 kilometers
  • 611.303 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance, i.e. the shortest distance between two points along the surface of the sphere).
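
A short Python sketch of the haversine formula, using the coordinates from the airport table at the bottom of this page; the mean Earth radius of 6371 km is an assumption, and a slightly different radius shifts the result by a fraction of a percent.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given radius (default: mean Earth radius)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

# YWB -> YBB, coordinates from the "Airport information" section
print(haversine_km(61.5883, -71.9292, 68.5342, -89.8081))  # ~1132 km
```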

How long does it take to fly from Kangiqsujuaq to Kugaaruk?

The estimated flight time from Kangiqsujuaq (Wakeham Bay) Airport to Kugaaruk Airport is 1 hour and 50 minutes.
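
The assumptions behind this estimate are not stated on the page. A common rule of thumb adds a fixed allowance of roughly 30 minutes for taxi, takeoff, climb and landing to the cruise time at a typical jet speed of about 500 mph; the sketch below uses those assumed figures and lands in the same ballpark as the estimate above.

```python
def estimate_flight_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough block time: cruise segment plus a fixed takeoff/landing allowance."""
    return overhead_min + 60 * distance_miles / cruise_mph

minutes = estimate_flight_minutes(706)
print(f"{int(minutes // 60)} h {round(minutes % 60)} min")  # about 1 h 55 min with these assumed figures
```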

Flight carbon footprint between Kangiqsujuaq (Wakeham Bay) Airport (YWB) and Kugaaruk Airport (YBB)

On average, flying from Kangiqsujuaq to Kugaaruk generates about 125 kg of CO2 per passenger, which is equivalent to 276 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
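
The methodology behind this figure is not published here. One simple way to approximate it is a flat per-passenger emission factor per mile flown; the factor in the sketch below is back-calculated from the numbers above (about 125 kg over 706 miles) and is an assumption, not the site's actual model.

```python
KG_PER_PASSENGER_MILE = 0.177   # assumed factor, back-calculated from the figures above
LBS_PER_KG = 2.20462

def co2_per_passenger(distance_miles, factor=KG_PER_PASSENGER_MILE):
    """Very rough per-passenger CO2 estimate for a flight of the given length."""
    kg = distance_miles * factor
    return kg, kg * LBS_PER_KG

kg, lbs = co2_per_passenger(706)
print(f"about {kg:.0f} kg ({lbs:.0f} lbs) of CO2 per passenger")  # roughly 125 kg, in line with the figure above
```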

Map of flight path from Kangiqsujuaq to Kugaaruk

See the map of the shortest flight path between Kangiqsujuaq (Wakeham Bay) Airport (YWB) and Kugaaruk Airport (YBB).

Airport information

Origin Kangiqsujuaq (Wakeham Bay) Airport
City: Kangiqsujuaq
Country: Canada
IATA Code: YWB
ICAO Code: CYKG
Coordinates: 61°35′18″N, 71°55′45″W
Destination Kugaaruk Airport
City: Kugaaruk
Country: Canada
IATA Code: YBB
ICAO Code: CYBB
Coordinates: 68°32′3″N, 89°48′29″W
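
The coordinates above are listed in degrees, minutes and seconds, while the distance formulas earlier on the page work in decimal degrees. A small conversion sketch (the helper name is illustrative):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# Kangiqsujuaq (Wakeham Bay) Airport: 61°35′18″N, 71°55′45″W
print(dms_to_decimal(61, 35, 18, "N"), dms_to_decimal(71, 55, 45, "W"))
# Kugaaruk Airport: 68°32′3″N, 89°48′29″W
print(dms_to_decimal(68, 32, 3, "N"), dms_to_decimal(89, 48, 29, "W"))
```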