
How far is Bella Coola from Ogoki Post?

The distance between Ogoki Post (Ogoki Post Airport) and Bella Coola (Bella Coola Airport) is 1713 miles / 2757 kilometers / 1489 nautical miles.

The driving distance from Ogoki Post (YOG) to Bella Coola (QBC) is 2347 miles / 3777 kilometers, and travel time by car is about 52 hours 29 minutes.

Ogoki Post Airport – Bella Coola Airport: 1713 miles / 2757 kilometers / 1489 nautical miles


Distance from Ogoki Post to Bella Coola

There are several ways to calculate the distance from Ogoki Post to Bella Coola. Here are two standard methods:

Vincenty's formula (applied above)
  • 1713.377 miles
  • 2757.413 kilometers
  • 1488.884 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
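For readers who want to reproduce the figure, here is a minimal Python sketch of the standard Vincenty inverse formula on the WGS-84 ellipsoid. The decimal coordinates are converted from the degrees/minutes/seconds values in the airport information section below; the result should land very close to the 1713.377-mile figure quoted above, though the exact digits depend on the ellipsoid constants and rounding used.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns statute miles."""
    a = 6378137.0                # WGS-84 semi-major axis (meters)
    f = 1 / 298.257223563        # WGS-84 flattening
    b = (1 - f) * a              # semi-minor axis (meters)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0           # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = 0.0 if cos2_alpha == 0 else (
            cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m
            * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sigma_m ** 2)))
    meters = b * A * (sigma - delta_sigma)
    return meters / 1609.344     # meters -> statute miles

# YOG 51°39′30″N, 85°54′6″W  ->  (51.6583, -85.9017)
# QBC 52°23′15″N, 126°35′45″W -> (52.3875, -126.5958)
print(vincenty_miles(51.6583, -85.9017, 52.3875, -126.5958))  # ≈ 1713 miles
```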

Haversine formula
  • 1707.904 miles
  • 2748.605 kilometers
  • 1484.128 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
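The spherical calculation is much shorter. Here is a compact sketch, assuming a mean Earth radius of 3958.8 miles; with the same coordinates as above it comes out near the 1707.9-mile figure quoted, with the exact digits depending on the radius chosen.

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance on a sphere (haversine formula), in miles."""
    R = 3958.8                               # mean Earth radius in miles
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

print(haversine_miles(51.6583, -85.9017, 52.3875, -126.5958))  # ≈ 1708 miles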

How long does it take to fly from Ogoki Post to Bella Coola?

The estimated flight time from Ogoki Post Airport to Bella Coola Airport is 3 hours and 44 minutes.
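A common way to approximate flight time is cruise time plus a fixed buffer for taxi, climb, and descent. The sketch below uses assumed parameters (500 mph cruise, 30-minute buffer), not the site's actual model, and gives a figure in the same ballpark as the 3 hours 44 minutes quoted above; the quoted figure implies slightly different parameters.

```python
def estimated_flight_time(distance_miles, cruise_mph=500, buffer_minutes=30):
    """Hypothetical estimate: cruise time plus a fixed taxi/climb/descent
    buffer. Both parameters are assumptions for illustration only."""
    minutes = distance_miles / cruise_mph * 60 + buffer_minutes
    hours, mins = divmod(round(minutes), 60)
    return f"{hours} hours {mins} minutes"

print(estimated_flight_time(1713.377))  # ≈ 3 hours 56 minutes with these assumptions
```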

Flight carbon footprint between Ogoki Post Airport (YOG) and Bella Coola Airport (QBC)

On average, flying from Ogoki Post to Bella Coola generates about 194 kg of CO2 per passenger, equivalent to roughly 427 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
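One simple way such estimates are built is to multiply distance by a per-passenger emission factor. The factor below is hypothetical, back-solved purely to reproduce the 194 kg figure above (real factors vary by aircraft type and load factor), and the pound conversion uses 1 kg = 2.20462 lbs.

```python
def co2_per_passenger_kg(distance_km, kg_co2_per_pax_km=0.0704):
    """Hypothetical emission factor chosen only to match the ~194 kg
    figure quoted above; not the site's actual methodology."""
    return distance_km * kg_co2_per_pax_km

kg = co2_per_passenger_kg(2757.413)
print(f"{kg:.0f} kg CO2 = {kg * 2.20462:.0f} lbs")  # ≈ 194 kg ≈ 427-428 lbs
```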

Map of flight path and driving directions from Ogoki Post to Bella Coola

See the map of the shortest flight path between Ogoki Post Airport (YOG) and Bella Coola Airport (QBC).

Airport information

Origin: Ogoki Post Airport
City: Ogoki Post
Country: Canada
IATA Code: YOG
ICAO Code: CNT3
Coordinates: 51°39′30″N, 85°54′6″W
Destination: Bella Coola Airport
City: Bella Coola
Country: Canada
IATA Code: QBC
ICAO Code: CYBD
Coordinates: 52°23′15″N, 126°35′45″W