
How far is Brandon from Kugluktuk?

The distance between Kugluktuk (Kugluktuk Airport) and Brandon (Brandon Municipal Airport) is 1345 miles / 2165 kilometers / 1169 nautical miles.

The driving distance from Kugluktuk (YCO) to Brandon (YBR) is 2030 miles / 3267 kilometers, and travel time by car is about 44 hours 13 minutes.

Kugluktuk Airport – Brandon Municipal Airport

1345 miles / 2165 kilometers / 1169 nautical miles


Distance from Kugluktuk to Brandon

There are several ways to calculate the distance from Kugluktuk to Brandon. Here are two standard methods:

Vincenty's formula (applied above)
  • 1345.036 miles
  • 2164.626 kilometers
  • 1168.805 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
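If you want to check the ellipsoidal figure yourself, the short Python sketch below uses pyproj's geodesic solver (Karney's algorithm on the WGS-84 ellipsoid rather than Vincenty's iteration, but the two agree to well under a metre on a route like this). The decimal coordinates come from the airport information further down the page.

```python
# Sketch: ellipsoidal distance on WGS-84 using pyproj's geodesic solver
# (Karney's algorithm, which matches Vincenty's result to well under a metre).
from pyproj import Geod

# Airport coordinates in decimal degrees (lat, lon), from the table below.
YCO = (67.8167, -115.1439)   # Kugluktuk Airport
YBR = (49.9100, -99.9517)    # Brandon Municipal Airport

geod = Geod(ellps="WGS84")
# Geod.inv takes lon/lat order and returns fwd azimuth, back azimuth, distance (m).
_, _, meters = geod.inv(YCO[1], YCO[0], YBR[1], YBR[0])

print(f"{meters / 1609.344:.3f} miles")       # ≈ 1345.0
print(f"{meters / 1000:.3f} kilometers")      # ≈ 2164.6
print(f"{meters / 1852:.3f} nautical miles")  # ≈ 1168.8
```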

Haversine formula
  • 1342.339 miles
  • 2160.285 kilometers
  • 1166.460 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
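The great-circle figure is easy to reproduce with nothing but the Python standard library; the sketch below assumes the usual mean Earth radius of 6,371 km.

```python
# Minimal haversine sketch: great-circle distance on a sphere of radius 6371 km.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points, in kilometers."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
    return 2 * radius_km * asin(sqrt(a))

km = haversine_km(67.8167, -115.1439, 49.9100, -99.9517)  # YCO -> YBR
print(f"{km:.3f} km, {km / 1.609344:.3f} mi, {km / 1.852:.3f} nm")  # ≈ 2160 km
```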

How long does it take to fly from Kugluktuk to Brandon?

The estimated flight time from Kugluktuk Airport to Brandon Municipal Airport is 3 hours and 2 minutes.
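The assumptions behind this estimate aren't spelled out on the page; a common rule of thumb for jet routes of this length is an average cruise speed of about 500 mph plus roughly 30 minutes for taxi, climb and descent, which lands in the same ballpark:

```python
# Rough flight-time rule of thumb (assumed: ~500 mph cruise + 30 min overhead);
# not necessarily the formula behind the estimate above.
distance_miles = 1345
cruise_mph = 500          # assumed average cruise speed
overhead_hours = 0.5      # assumed allowance for taxi, climb and descent

hours = distance_miles / cruise_mph + overhead_hours
h, m = divmod(round(hours * 60), 60)
print(f"Estimated flight time: {h} h {m} min")  # roughly 3 h 11 min
```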

Flight carbon footprint between Kugluktuk Airport (YCO) and Brandon Municipal Airport (YBR)

On average, flying from Kugluktuk to Brandon generates about 170 kg of CO2 per passenger, which is equivalent to about 374 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
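The figures above also let you back out the implied per-passenger emission intensity for this route; the snippet below is simple arithmetic on the numbers already quoted, not an official methodology.

```python
# Implied emission intensity for this route, derived from the figures above.
co2_kg = 170
distance_km = 2165

print(f"{co2_kg * 2.20462:.1f} lb")                        # ≈ 374.8 lb
print(f"{co2_kg / distance_km * 1000:.0f} g CO2 per km")   # ≈ 79 g per passenger-km
```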

Map of flight path and driving directions from Kugluktuk to Brandon

See the map of the shortest flight path between Kugluktuk Airport (YCO) and Brandon Municipal Airport (YBR).

Airport information

Origin Kugluktuk Airport
City: Kugluktuk
Country: Canada
IATA Code: YCO
ICAO Code: CYCO
Coordinates: 67°49′0″N, 115°8′38″W
Destination Brandon Municipal Airport
City: Brandon
Country: Canada
IATA Code: YBR
ICAO Code: CYBR
Coordinates: 49°54′36″N, 99°57′6″W
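The coordinates are listed in degrees, minutes and seconds; to plug them into the distance formulas above they first need converting to decimal degrees. A small helper (a hypothetical convenience function, not part of any particular library) does the job:

```python
# Convert degrees/minutes/seconds to the decimal degrees used in the formulas above.
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """hemisphere is one of 'N', 'S', 'E', 'W'."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

# Kugluktuk Airport (YCO): 67°49′0″N, 115°8′38″W
print(dms_to_decimal(67, 49, 0, "N"), dms_to_decimal(115, 8, 38, "W"))   # 67.8167, -115.1439
# Brandon Municipal Airport (YBR): 49°54′36″N, 99°57′6″W
print(dms_to_decimal(49, 54, 36, "N"), dms_to_decimal(99, 57, 6, "W"))   # 49.91, -99.9517
```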