
How far is Brochet from Kingfisher Lake?

The distance between Kingfisher Lake (Kingfisher Lake Airport) and Brochet (Brochet Airport) is 573 miles / 922 kilometers / 498 nautical miles.

The driving distance from Kingfisher Lake (KIF) to Brochet (YBT) is 1283 miles / 2064 kilometers, and travel time by car is about 36 hours 2 minutes.

Kingfisher Lake Airport – Brochet Airport

  • 573 miles
  • 922 kilometers
  • 498 nautical miles


Distance from Kingfisher Lake to Brochet

There are several ways to calculate the distance from Kingfisher Lake to Brochet. Here are two standard methods:

Vincenty's formula (applied above)
  • 573.106 miles
  • 922.324 kilometers
  • 498.015 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
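The iterative inverse method can be sketched in Python. This is a minimal implementation of the standard Vincenty inverse algorithm on the WGS-84 ellipsoid, not necessarily the exact code used above; the coordinates are the airport positions listed below, converted from degrees/minutes/seconds (west longitudes negative):

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Ellipsoidal (WGS-84) distance in kilometers via Vincenty's inverse formula."""
    a = 6378137.0                # semi-major axis, meters
    f = 1 / 298.257223563        # flattening
    b = (1 - f) * a              # semi-minor axis
    L = math.radians(lon2 - lon1)
    # Reduced latitudes on the auxiliary sphere
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)
    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0           # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # cos(2*sigma_m); 0 for equatorial lines (cos2_alpha == 0)
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                   if cos2_alpha else 0.0)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break
    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2) -
        B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) *
        (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma) / 1000

# KIF (53°0′45″N, 89°51′19″W) to YBT (57°53′21″N, 101°40′44″W)
print(f"{vincenty_km(53.0125, -89.855278, 57.889167, -101.678889):.3f} km")
```

The iteration converges in a handful of steps for a route like this; the result agrees with the 922.324 km figure above to within meters.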

Haversine formula
  • 571.595 miles
  • 919.893 kilometers
  • 496.702 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
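The haversine calculation is compact enough to show in full. The sketch below uses a mean Earth radius of 6371 km and the airport coordinates listed later on this page, converted to decimal degrees (west longitudes negative); the mile and nautical-mile figures are derived from the kilometer result by the exact conversion factors:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, r=6371.0):
    """Great-circle distance on a sphere of radius r (kilometers)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# KIF (53°0′45″N, 89°51′19″W) to YBT (57°53′21″N, 101°40′44″W)
km = haversine_km(53.0125, -89.855278, 57.889167, -101.678889)
mi = km / 1.609344   # exact: 1 mile = 1.609344 km
nmi = km / 1.852     # exact: 1 nautical mile = 1.852 km
print(f"{mi:.3f} mi / {km:.3f} km / {nmi:.3f} nmi")
```

This reproduces the ~919.9 km haversine figure above; the ~2.4 km gap versus Vincenty reflects the spherical versus ellipsoidal Earth models.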

How long does it take to fly from Kingfisher Lake to Brochet?

The estimated flight time from Kingfisher Lake Airport to Brochet Airport is 1 hour and 35 minutes.
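This figure is consistent with a common rule of thumb: a fixed overhead for taxi, climb, and descent plus cruise time over the straight-line distance. The overhead and cruise speed below (30 minutes, ~500 mph) are illustrative assumptions, not the site's documented method:

```python
def estimated_flight_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    # Fixed overhead covers taxi, takeoff, climb, descent, and landing;
    # the remainder is cruise time at the assumed average speed.
    return overhead_min + distance_miles / cruise_mph * 60

minutes = estimated_flight_minutes(573)
print(f"{int(minutes // 60)} h {round(minutes % 60)} min")
```

With these assumptions the estimate lands within a few minutes of the 1 hour 35 minutes quoted above; the exact published time implies slightly different constants.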

Flight carbon footprint between Kingfisher Lake Airport (KIF) and Brochet Airport (YBT)

On average, flying from Kingfisher Lake to Brochet generates about 109 kg of CO2 per passenger, equivalent to roughly 241 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
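The kilogram-to-pound conversion is simple arithmetic. Note that exactly 109 kg converts to about 240.3 lb, so the 241 lb figure above presumably comes from an unrounded kilogram estimate:

```python
KG_PER_LB = 0.45359237  # exact definition of the avoirdupois pound

co2_kg = 109
co2_lb = co2_kg / KG_PER_LB
print(f"{co2_lb:.1f} lb")  # 240.3 lb
```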


Airport information

Origin Kingfisher Lake Airport
City: Kingfisher Lake
Country: Canada
IATA Code: KIF
ICAO Code: CNM5
Coordinates: 53°0′45″N, 89°51′19″W
Destination Brochet Airport
City: Brochet
Country: Canada
IATA Code: YBT
ICAO Code: CYBT
Coordinates: 57°53′21″N, 101°40′44″W