
How far is Brandon from Ogoki Post?

The distance between Ogoki Post (Ogoki Post Airport) and Brandon (Brandon Municipal Airport) is 626 miles / 1008 kilometers / 544 nautical miles.

The driving distance from Ogoki Post (YOG) to Brandon (YBR) is 821 miles / 1321 kilometers, and travel time by car is about 20 hours 33 minutes.

Ogoki Post Airport – Brandon Municipal Airport

626 miles / 1008 kilometers / 544 nautical miles


Distance from Ogoki Post to Brandon

There are several ways to calculate the distance from Ogoki Post to Brandon. Here are two standard methods:

Vincenty's formula (applied above)
  • 626.402 miles
  • 1008.097 kilometers
  • 544.329 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
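As an illustration, here is a minimal sketch of Vincenty's inverse formula on the WGS-84 ellipsoid, fed the airport coordinates listed in the airport information section below; the result should land close to the 1008 km quoted above. This is a textbook transcription of the iteration, not this site's actual code.

```python
from math import atan, atan2, cos, radians, sin, sqrt, tan

def vincenty_km(lat1, lon1, lat2, lon2):
    """Inverse Vincenty distance on the WGS-84 ellipsoid, in kilometers."""
    a, f = 6378137.0, 1 / 298.257223563    # WGS-84 semi-major axis (m) and flattening
    b = (1 - f) * a                        # semi-minor axis
    U1 = atan((1 - f) * tan(radians(lat1)))  # reduced latitudes
    U2 = atan((1 - f) * tan(radians(lat2)))
    L = radians(lon2 - lon1)
    sinU1, cosU1, sinU2, cosU2 = sin(U1), cos(U1), sin(U2), cos(U2)

    lam = L
    for _ in range(200):                   # iterate lambda until convergence
        sin_lam, cos_lam = sin(lam), cos(lam)
        sin_sigma = sqrt((cosU2 * sin_lam) ** 2 +
                         (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2) -
        B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - d_sigma) / 1000.0

# YOG (51°39'30"N, 85°54'6"W) to YBR (49°54'36"N, 99°57'6"W)
d = vincenty_km(51.658333, -85.901667, 49.91, -99.951667)  # distance in km
```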

Haversine formula
  • 624.507 miles
  • 1005.046 kilometers
  • 542.682 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
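The haversine formula is short enough to sketch in full. With the airport coordinates from the table below and a mean Earth radius of 6371 km, it reproduces the roughly 1005 km figure above:

```python
from math import asin, cos, radians, sin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, r=6371.0):
    """Great-circle distance on a sphere of radius r (default: mean Earth radius, km)."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * r * asin(sqrt(a))

# YOG: 51°39'30"N, 85°54'6"W    YBR: 49°54'36"N, 99°57'6"W
yog = (51 + 39 / 60 + 30 / 3600, -(85 + 54 / 60 + 6 / 3600))
ybr = (49 + 54 / 60 + 36 / 3600, -(99 + 57 / 60 + 6 / 3600))
d_km = haversine_km(yog[0], yog[1], ybr[0], ybr[1])  # about 1005 km
```

The spherical result is about 3 km shorter than the ellipsoidal (Vincenty) figure, which is typical: the two methods usually agree to within a few tenths of a percent.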

How long does it take to fly from Ogoki Post to Brandon?

The estimated flight time from Ogoki Post Airport to Brandon Municipal Airport is 1 hour and 41 minutes.
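The site does not publish its flight-time model. A common rule of thumb adds a fixed allowance of about 30 minutes for taxi, climb, and descent to cruise time at roughly 500 mph; for the 626-mile distance above it lands within a few minutes of the quoted 1 hour 41 minutes (the cruise speed and overhead here are assumptions, not the site's actual parameters):

```python
def estimated_flight_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough airliner flight-time estimate: fixed overhead plus time at cruise speed."""
    return overhead_min + distance_miles / cruise_mph * 60

m = estimated_flight_minutes(626.402)
print(f"{int(m // 60)} h {round(m % 60)} min")  # prints "1 h 45 min" with these assumptions
```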

Flight carbon footprint between Ogoki Post Airport (YOG) and Brandon Municipal Airport (YBR)

On average, flying from Ogoki Post to Brandon generates about 116 kg of CO2 per passenger; 116 kilograms is equal to about 256 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
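The pound figure is just a unit conversion of the 116 kg estimate (1 lb is defined as exactly 0.45359237 kg):

```python
KG_PER_LB = 0.45359237   # exact definition of the avoirdupois pound
co2_kg = 116             # per-passenger estimate from the text
co2_lb = co2_kg / KG_PER_LB
print(round(co2_lb))     # prints 256
```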

Map of flight path and driving directions from Ogoki Post to Brandon

See the map of the shortest flight path between Ogoki Post Airport (YOG) and Brandon Municipal Airport (YBR).

Airport information

Origin: Ogoki Post Airport
City: Ogoki Post
Country: Canada
IATA Code: YOG
ICAO Code: CNT3
Coordinates: 51°39′30″N, 85°54′6″W
Destination: Brandon Municipal Airport
City: Brandon
Country: Canada
IATA Code: YBR
ICAO Code: CYBR
Coordinates: 49°54′36″N, 99°57′6″W