
How far is Kugaaruk from Tuktoyaktuk?

The distance between Tuktoyaktuk (Tuktoyaktuk/James Gruben Airport) and Kugaaruk (Kugaaruk Airport) is 1055 miles / 1697 kilometers / 916 nautical miles.

Tuktoyaktuk/James Gruben Airport – Kugaaruk Airport

  • 1055 miles
  • 1697 kilometers
  • 916 nautical miles


Distance from Tuktoyaktuk to Kugaaruk

There are several ways to calculate the distance from Tuktoyaktuk to Kugaaruk. Here are two standard methods:

Vincenty's formula (applied above)
  • 1054.615 miles
  • 1697.238 kilometers
  • 916.435 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
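To illustrate how such an ellipsoidal calculation works, here is a minimal pure-Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The function name, iteration cap, and convergence tolerance are illustrative choices, not the site's actual implementation:

```python
from math import atan, atan2, cos, radians, sin, sqrt, tan

def vincenty_km(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns km."""
    a, f = 6378137.0, 1 / 298.257223563   # WGS-84 semi-major axis (m), flattening
    b = (1 - f) * a
    L = radians(lon2 - lon1)
    U1 = atan((1 - f) * tan(radians(lat1)))  # reduced latitudes
    U2 = atan((1 - f) * tan(radians(lat2)))
    sinU1, cosU1, sinU2, cosU2 = sin(U1), cos(U1), sin(U2), cos(U2)
    lam = L
    for _ in range(200):                     # iterate lambda to convergence
        sinL, cosL = sin(lam), cos(lam)
        sin_sigma = sqrt((cosU2 * sinL) ** 2 +
                         (cosU1 * sinU2 - sinU1 * cosU2 * cosL) ** 2)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cosL
        sigma = atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sinL / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (sigma + C * sin_sigma * (
            cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break
    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2) -
        B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - d_sigma) / 1000.0

# YUB → YBB using the airport coordinates listed in this article
print(round(vincenty_km(69.4331, -133.0258, 68.5342, -89.8081)))  # ≈ 1697
```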

Haversine formula
  • 1050.362 miles
  • 1690.394 kilometers
  • 912.740 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
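The haversine calculation is compact enough to sketch directly. This minimal Python version assumes a mean Earth radius of 6,371 km; the function name and the decimal-degree conversions of the airport coordinates are our own:

```python
from math import asin, cos, radians, sin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, r=6371.0):
    """Great-circle distance in km on a sphere of radius r (km)."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * r * asin(sqrt(a))

# YUB: 69°25′59″N, 133°1′33″W  →  69.4331, -133.0258
# YBB: 68°32′3″N,  89°48′29″W  →  68.5342,  -89.8081
print(round(haversine_km(69.4331, -133.0258, 68.5342, -89.8081)))  # ≈ 1690
```

The result agrees with the haversine figure above to within a kilometer; the small gap from the Vincenty value reflects the spherical-Earth assumption.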

How long does it take to fly from Tuktoyaktuk to Kugaaruk?

The estimated flight time from Tuktoyaktuk/James Gruben Airport to Kugaaruk Airport is 2 hours and 29 minutes.
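The article does not state how this estimate is derived. A common rule of thumb is a cruise speed of about 500 mph plus roughly 30 minutes for taxi, takeoff, and landing; the sketch below uses those assumed parameters (both are illustrative, and the result lands near, but not exactly on, the quoted 2 hours and 29 minutes):

```python
def estimate_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    """Rule-of-thumb flight time: cruise leg plus a fixed overhead."""
    total_min = distance_miles / cruise_mph * 60 + overhead_min
    h, m = divmod(round(total_min), 60)
    return f"{h} h {m:02d} min"

print(estimate_flight_time(1055))  # → 2 h 37 min under these assumptions
```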

What is the time difference between Tuktoyaktuk and Kugaaruk?

There is no time difference between Tuktoyaktuk and Kugaaruk.

Flight carbon footprint between Tuktoyaktuk/James Gruben Airport (YUB) and Kugaaruk Airport (YBB)

On average, flying from Tuktoyaktuk to Kugaaruk generates about 154 kg of CO2 per passenger, which is equivalent to about 340 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
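The per-passenger emission factor behind this figure is not given. The sketch below simply back-solves it from the article's own numbers (about 0.146 kg of CO2 per passenger-mile) and is illustrative only; real calculators vary the factor by aircraft type and flight length:

```python
KG_PER_PASSENGER_MILE = 154 / 1055   # back-solved from this route's figures

def co2_kg(distance_miles, factor=KG_PER_PASSENGER_MILE):
    """Per-passenger CO2 estimate from a flat per-mile factor."""
    return distance_miles * factor

kg = co2_kg(1055)
print(round(kg), round(kg * 2.20462))  # → 154 kg, 340 lbs
```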

Map of flight path from Tuktoyaktuk to Kugaaruk

See the map of the shortest flight path between Tuktoyaktuk/James Gruben Airport (YUB) and Kugaaruk Airport (YBB).

Airport information

Origin: Tuktoyaktuk/James Gruben Airport
City: Tuktoyaktuk
Country: Canada
IATA Code: YUB
ICAO Code: CYUB
Coordinates: 69°25′59″N, 133°1′33″W
Destination: Kugaaruk Airport
City: Kugaaruk
Country: Canada
IATA Code: YBB
ICAO Code: CYBB
Coordinates: 68°32′3″N, 89°48′29″W