
How far is Brochet from Gods Lake Narrows?

The distance between Gods Lake Narrows (Gods Lake Narrows Airport) and Brochet (Brochet Airport) is 360 miles / 579 kilometers / 313 nautical miles.

Gods Lake Narrows Airport – Brochet Airport

360 miles / 579 kilometers / 313 nautical miles


Distance from Gods Lake Narrows to Brochet

There are several ways to calculate the distance from Gods Lake Narrows to Brochet. Here are two standard methods:

Vincenty's formula (applied above)
  • 360.018 miles
  • 579.392 kilometers
  • 312.847 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
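Below is a minimal sketch of the standard inverse Vincenty iteration on the WGS-84 ellipsoid, using the airport coordinates listed in the airport information section (converted to decimal degrees). It is an illustration of the method, not necessarily the exact implementation used for the figure above.

    import math

    def vincenty_distance(lat1, lon1, lat2, lon2):
        """Inverse Vincenty solution on the WGS-84 ellipsoid; returns metres."""
        a = 6378137.0               # semi-major axis (m)
        f = 1 / 298.257223563       # flattening
        b = (1 - f) * a             # semi-minor axis (m)

        L = math.radians(lon2 - lon1)
        U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
        U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)

        lam = L
        for _ in range(200):        # iterate until lambda converges
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.hypot(cosU2 * sin_lam,
                                   cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
            if sin_sigma == 0:
                return 0.0          # coincident points
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos2_alpha = 1 - sin_alpha ** 2
            cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
            C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
            lam_prev = lam
            lam = L + (1 - C) * f * sin_alpha * (
                sigma + C * sin_sigma * (
                    cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
            if abs(lam - lam_prev) < 1e-12:
                break

        u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        delta_sigma = B * sin_sigma * (
            cos_2sigma_m + B / 4 * (
                cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
                - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
                    * (-3 + 4 * cos_2sigma_m ** 2)))
        return b * A * (sigma - delta_sigma)

    # YGO and YBT coordinates from the airport information section, in decimal degrees
    ygo = (54.558889, -94.491389)
    ybt = (57.889167, -101.678889)
    print(vincenty_distance(*ygo, *ybt) / 1609.344)   # ≈ 360 statute miles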

Haversine formula
  • 359.089 miles
  • 577.898 kilometers
  • 312.040 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
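A minimal haversine sketch follows, assuming a mean Earth radius of 6,371 km (the radius used for the figure above isn't stated, so the result may differ slightly):

    import math

    def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
        """Great-circle distance on a sphere of the given mean radius, in km."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        a = (math.sin(dphi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
        return 2 * radius_km * math.asin(math.sqrt(a))

    print(haversine_distance(54.558889, -94.491389, 57.889167, -101.678889))  # ≈ 578 km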

How long does it take to fly from Gods Lake Narrows to Brochet?

The estimated flight time from Gods Lake Narrows Airport to Brochet Airport is 1 hour and 10 minutes.
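A common rule of thumb for such estimates is cruise time at a typical airliner speed plus a fixed allowance for taxi, climb, and descent. The sketch below uses illustrative parameters (a 500 mph cruise and a 30-minute allowance), which are assumptions rather than the site's actual model, so it lands close to but not exactly on the figure above:

    def estimated_flight_time(miles, cruise_mph=500, overhead_min=30):
        # Rough estimate: fixed taxi/climb/descent allowance plus cruise time.
        total_min = overhead_min + miles / cruise_mph * 60
        return divmod(round(total_min), 60)

    print(estimated_flight_time(360))   # (1, 13) -> about 1 hour 13 minutes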

What is the time difference between Gods Lake Narrows and Brochet?

There is no time difference between Gods Lake Narrows and Brochet.

Flight carbon footprint between Gods Lake Narrows Airport (YGO) and Brochet Airport (YBT)

On average, flying from Gods Lake Narrows to Brochet generates about 78 kg (172 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
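The arithmetic behind the quoted figures is a straight unit conversion; the per-mile intensity below is simply implied by dividing the estimate by this route's distance, not a stated emission factor:

    co2_kg = 78                    # estimate from the text
    co2_lb = co2_kg * 2.20462      # kilograms to pounds
    per_mile = co2_kg / 360        # implied intensity for this 360-mile flight
    print(round(co2_lb), round(per_mile, 3))   # 172 lb, ~0.217 kg CO2 per mile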

Map of flight path from Gods Lake Narrows to Brochet

See the map of the shortest flight path between Gods Lake Narrows Airport (YGO) and Brochet Airport (YBT).

Airport information

Origin: Gods Lake Narrows Airport
City: Gods Lake Narrows
Country: Canada
IATA Code: YGO
ICAO Code: CYGO
Coordinates: 54°33′32″N, 94°29′29″W
Destination: Brochet Airport
City: Brochet
Country: Canada
IATA Code: YBT
ICAO Code: CYBT
Coordinates: 57°53′21″N, 101°40′44″W
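The coordinates above are given in degrees, minutes, and seconds; the distance formulas earlier use decimal degrees. A small conversion sketch (with a hypothetical helper, dms_to_decimal) shows how the listed values map to the decimal coordinates used above:

    def dms_to_decimal(deg, minutes, seconds, hemisphere):
        """Convert degrees/minutes/seconds plus hemisphere letter to signed decimal degrees."""
        value = deg + minutes / 60 + seconds / 3600
        return -value if hemisphere in ("S", "W") else value

    # Coordinates as listed above
    ygo = (dms_to_decimal(54, 33, 32, "N"), dms_to_decimal(94, 29, 29, "W"))
    ybt = (dms_to_decimal(57, 53, 21, "N"), dms_to_decimal(101, 40, 44, "W"))
    print(ygo, ybt)   # ≈ (54.5589, -94.4914) and (57.8892, -101.6789)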