
How far is Kugaaruk from Resolute Bay?

The distance between Resolute Bay (Resolute Bay Airport) and Kugaaruk (Kugaaruk Airport) is 443 miles / 713 kilometers / 385 nautical miles.

There is no driving route between Resolute Bay (YRB) and Kugaaruk (YBB): Resolute Bay lies on Cornwallis Island, Kugaaruk is on the mainland, and both communities are accessible only by air.

Resolute Bay Airport – Kugaaruk Airport

  • 443 miles
  • 713 kilometers
  • 385 nautical miles


Distance from Resolute Bay to Kugaaruk

There are several ways to calculate the distance from Resolute Bay to Kugaaruk. Here are two standard methods:

Vincenty's formula (applied above)
  • 442.864 miles
  • 712.720 kilometers
  • 384.838 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
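The ellipsoidal figure above can be reproduced with a self-contained sketch of Vincenty's inverse method on the WGS-84 ellipsoid. The coordinates are the airport positions listed further down, converted to decimal degrees; the iteration limit and convergence tolerance are assumptions, not values stated by this site.

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns kilometers."""
    a, f = 6378137.0, 1 / 298.257223563   # WGS-84 semi-major axis (m) and flattening
    b = (1 - f) * a
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)
    lam = L
    for _ in range(200):  # iterate until the longitude difference converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break
    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - d_sigma) / 1000.0

# YRB (74.7167, -94.9692) to YBB (68.5342, -89.8081)
print(f"{vincenty_km(74.7167, -94.9692, 68.5342, -89.8081):.2f} km")  # ≈ 712.7 km
```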

Haversine formula
  • 441.313 miles
  • 710.225 kilometers
  • 383.491 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
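The great-circle figure can be checked in a few lines of Python. This sketch uses the airport coordinates listed below and an assumed mean Earth radius of 6,371 km:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two points on a sphere, in kilometers."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# YRB: 74°43′0″N, 94°58′9″W  → (74.7167, -94.9692)
# YBB: 68°32′3″N, 89°48′29″W → (68.5342, -89.8081)
print(f"{haversine_km(74.7167, -94.9692, 68.5342, -89.8081):.1f} km")  # ≈ 710 km
```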

How long does it take to fly from Resolute Bay to Kugaaruk?

The estimated flight time from Resolute Bay Airport to Kugaaruk Airport is 1 hour and 20 minutes.
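A common rule of thumb for estimates like this is cruise time at roughly 500 mph plus a fixed allowance for taxi, takeoff, and landing. This is an assumption about the method, not this site's stated formula; with a 30-minute allowance it lands within a few minutes of the figure above:

```python
def estimated_flight_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough flight-time estimate: fixed overhead plus time at cruise speed.
    The 500 mph cruise and 30 min overhead are assumed rule-of-thumb values."""
    return overhead_min + 60 * distance_miles / cruise_mph

mins = estimated_flight_minutes(443)
print(f"{int(mins // 60)} h {round(mins % 60)} min")  # about 1 h 23 min
```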

Flight carbon footprint between Resolute Bay Airport (YRB) and Kugaaruk Airport (YBB)

On average, flying from Resolute Bay to Kugaaruk generates about 90 kg (roughly 198 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
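The unit conversion is straightforward; the pound is defined as exactly 0.45359237 kg:

```python
KG_PER_LB = 0.45359237  # exact definition of the international avoirdupois pound

def kg_to_lb(kg):
    """Convert kilograms to pounds."""
    return kg / KG_PER_LB

print(round(kg_to_lb(90)))  # 198
```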

Map of flight path and driving directions from Resolute Bay to Kugaaruk

See the map of the shortest flight path between Resolute Bay Airport (YRB) and Kugaaruk Airport (YBB).

Airport information

Origin: Resolute Bay Airport
City: Resolute Bay
Country: Canada
IATA Code: YRB
ICAO Code: CYRB
Coordinates: 74°43′0″N, 94°58′9″W
Destination: Kugaaruk Airport
City: Kugaaruk
Country: Canada
IATA Code: YBB
ICAO Code: CYBB
Coordinates: 68°32′3″N, 89°48′29″W
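The coordinates above are given in degrees, minutes, and seconds; the distance formulas earlier need signed decimal degrees. A small helper (an illustrative sketch, not part of this site) performs the conversion:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus an N/S/E/W hemisphere letter
    to signed decimal degrees (south and west are negative)."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# YRB: 74°43′0″N, 94°58′9″W
print(dms_to_decimal(74, 43, 0, "N"), dms_to_decimal(94, 58, 9, "W"))
# ≈ 74.7167 and -94.9692
```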