How far is Brochet from North Spirit Lake?

The distance between North Spirit Lake (North Spirit Lake Airport) and Brochet (Brochet Airport) is 507 miles / 817 kilometers / 441 nautical miles.

North Spirit Lake Airport – Brochet Airport
  • 507 miles
  • 817 kilometers
  • 441 nautical miles

Distance from North Spirit Lake to Brochet

There are several ways to calculate the distance from North Spirit Lake to Brochet. Here are two standard methods:

Vincenty's formula (applied above)
  • 507.429 miles
  • 816.628 kilometers
  • 440.944 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
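
As a quick cross-check, the sketch below computes the same ellipsoidal distance with the pyproj library. Note that pyproj uses Karney's geodesic algorithm rather than Vincenty's iteration, but on the WGS-84 ellipsoid the two agree to well under a metre, so the result should match the ~816.6 km figure above. The decimal coordinates are converted from the airport information listed further down; the variable names are illustrative.

```python
# Ellipsoidal geodesic distance between YNO and YBT using pyproj.
# pyproj/PROJ implements Karney's geodesic algorithm, not Vincenty's
# iteration, but both work on the WGS-84 ellipsoid and give essentially
# the same answer for a route like this.
from pyproj import Geod

# Decimal-degree coordinates from the airport information section (lat, lon).
YNO = (52.49000, -92.97083)    # North Spirit Lake Airport
YBT = (57.88917, -101.67889)   # Brochet Airport

geod = Geod(ellps="WGS84")
# Geod.inv takes lon1, lat1, lon2, lat2 and returns forward/back azimuths
# plus the geodesic distance in metres.
_, _, meters = geod.inv(YNO[1], YNO[0], YBT[1], YBT[0])

print(f"{meters / 1000:.3f} km")         # ≈ 816.6 km
print(f"{meters / 1609.344:.3f} miles")  # ≈ 507.4 miles
print(f"{meters / 1852:.3f} NM")         # ≈ 440.9 NM
```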

Haversine formula
  • 506.321 miles
  • 814.845 kilometers
  • 439.981 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
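
Below is a minimal Python sketch of the haversine calculation, assuming a mean earth radius of 6,371 km and the decimal-degree coordinates from the airport information section; it reproduces roughly the 814.8 km / 506.3 mile figure above.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given mean radius (km)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return radius_km * 2 * math.asin(math.sqrt(a))

# North Spirit Lake Airport and Brochet Airport coordinates,
# converted to decimal degrees from the airport information section.
km = haversine_km(52.49000, -92.97083, 57.88917, -101.67889)
print(f"{km:.3f} km")                # ≈ 814.8 km
print(f"{km / 1.609344:.3f} miles")  # ≈ 506.3 miles
print(f"{km / 1.852:.3f} NM")        # ≈ 440.0 NM
```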

How long does it take to fly from North Spirit Lake to Brochet?

The estimated flight time from North Spirit Lake Airport to Brochet Airport is 1 hour and 27 minutes.
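
The exact assumptions behind that figure are not published here. The sketch below uses an assumed average cruise speed of about 500 mph plus a fixed 25-minute allowance for taxi, climb, and descent; these parameters are illustrative guesses, not the calculator's method, but they land in the same ballpark.

```python
# Rough flight-time estimate from great-circle distance. The cruise speed
# and fixed ground/climb allowance are assumptions for illustration only.
def estimate_flight_minutes(distance_miles, cruise_mph=500.0, overhead_min=25.0):
    return distance_miles / cruise_mph * 60.0 + overhead_min

minutes = estimate_flight_minutes(507.4)
hours, mins = divmod(round(minutes), 60)
print(f"Estimated flight time: {hours} h {mins} min")  # ≈ 1 h 26 min, close to the quoted 1 h 27 min
```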

What is the time difference between North Spirit Lake and Brochet?

There is no time difference between North Spirit Lake and Brochet.

Flight carbon footprint between North Spirit Lake Airport (YNO) and Brochet Airport (YBT)

On average, flying from North Spirit Lake to Brochet generates about 100 kg of CO2 per passenger, which is equivalent to roughly 220 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
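
As a rough sketch of how such a per-passenger figure scales with distance, the example below back-solves an emission factor from the numbers above (about 100 kg over ~817 km, i.e. roughly 0.12 kg CO2 per passenger-km). This factor is an illustrative assumption derived from this page, not an official methodology.

```python
# Per-passenger CO2 estimate from flight distance. The emission factor is
# back-solved from the figures quoted above and is illustrative only.
KG_CO2_PER_PASSENGER_KM = 100.0 / 816.628

def co2_per_passenger_kg(distance_km):
    return distance_km * KG_CO2_PER_PASSENGER_KM

kg = co2_per_passenger_kg(816.628)
print(f"{kg:.0f} kg CO2 per passenger ({kg * 2.20462:.0f} lbs)")  # ≈ 100 kg ≈ 220 lbs
```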

Map of flight path from North Spirit Lake to Brochet

See the map of the shortest flight path between North Spirit Lake Airport (YNO) and Brochet Airport (YBT).

Airport information

Origin: North Spirit Lake Airport
City: North Spirit Lake
Country: Canada
IATA Code: YNO
ICAO Code: CKQ3
Coordinates: 52°29′24″N, 92°58′15″W
Destination: Brochet Airport
City: Brochet
Country: Canada
IATA Code: YBT
ICAO Code: CYBT
Coordinates: 57°53′21″N, 101°40′44″W
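
The coordinates above are given in degrees, minutes, and seconds. A small helper like the one below (the function name is illustrative) converts them to the signed decimal degrees used in the distance sketches earlier.

```python
# Convert degrees/minutes/seconds with an N/S/E/W hemisphere letter
# into signed decimal degrees, matching the coordinate format listed above.
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    value = degrees + minutes / 60.0 + seconds / 3600.0
    return -value if hemisphere in ("S", "W") else value

# North Spirit Lake Airport (YNO): 52°29′24″N, 92°58′15″W
print(dms_to_decimal(52, 29, 24, "N"), dms_to_decimal(92, 58, 15, "W"))   # 52.49, -92.970833...

# Brochet Airport (YBT): 57°53′21″N, 101°40′44″W
print(dms_to_decimal(57, 53, 21, "N"), dms_to_decimal(101, 40, 44, "W"))  # 57.889166..., -101.678888...
```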