
How far is Bella Coola from Comox?

The distance between Comox (CFB Comox) and Bella Coola (Bella Coola Airport) is 199 miles / 321 kilometers / 173 nautical miles.

The driving distance from Comox (YQQ) to Bella Coola (QBC) is 347 miles / 558 kilometers, and travel time by car is about 14 hours 19 minutes.

CFB Comox – Bella Coola Airport

199 miles · 321 kilometers · 173 nautical miles


Distance from Comox to Bella Coola

There are several ways to calculate the distance from Comox to Bella Coola. Here are two standard methods:

Vincenty's formula (applied above)
  • 199.440 miles
  • 320.967 kilometers
  • 173.308 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
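As a sketch, Vincenty's inverse method can be implemented with the Python standard library. The WGS-84 constants below are the standard ellipsoid parameters, and the decimal coordinates are converted from the DMS values listed under Airport information on this page; the exact figure depends on the precision of the input coordinates.

```python
import math

def vincenty(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns meters."""
    a = 6378137.0                 # semi-major axis (m)
    f = 1 / 298.257223563         # flattening
    b = (1 - f) * a               # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0            # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # cos(2*sigma_m); zero for points on the equator
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2) -
        B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma)

# YQQ and QBC in decimal degrees (converted from the DMS coordinates on this page)
yqq = (49.710556, -124.886944)
qbc = (52.387500, -126.595833)
print(round(vincenty(*yqq, *qbc) / 1000, 3), "km")
```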

Haversine formula
  • 199.269 miles
  • 320.693 kilometers
  • 173.160 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
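The haversine computation is short enough to show in full. This sketch assumes the commonly used mean Earth radius of 6,371 km; a slightly different radius would shift the result by a few hundred meters.

```python
import math

def haversine(lat1, lon1, lat2, lon2, r=6371.0):
    """Great-circle distance in km on a sphere of radius r (mean Earth radius)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * r * math.asin(math.sqrt(h))

# YQQ -> QBC, using the decimal form of the coordinates listed on this page
print(round(haversine(49.710556, -124.886944, 52.387500, -126.595833), 3), "km")
```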

How long does it take to fly from Comox to Bella Coola?

The estimated flight time from CFB Comox to Bella Coola Airport is 52 minutes.
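A common rule of thumb for such estimates is cruise time at a fixed average speed plus a flat allowance for taxi, takeoff, and landing. The constants below (500 mph cruise, 30-minute overhead) are assumptions for illustration, not necessarily the values this page uses, so the result differs slightly from the 52 minutes quoted.

```python
def flight_time_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough flight-time estimate: fixed taxi/climb overhead plus cruise time."""
    return overhead_min + distance_miles / cruise_mph * 60

print(round(flight_time_minutes(199.44)), "minutes")  # roughly 54 minutes
```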

What is the time difference between Comox and Bella Coola?

There is no time difference between Comox and Bella Coola.

Flight carbon footprint between CFB Comox (YQQ) and Bella Coola Airport (QBC)

On average, flying from Comox to Bella Coola generates about 54 kg of CO2 per passenger, which is roughly 119 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
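The kilogram-to-pound conversion behind that figure is simple (1 kg ≈ 2.20462 lb), and the result is often rounded up to 120 lb:

```python
KG_TO_LB = 2.20462  # pounds per kilogram

co2_kg = 54
print(round(co2_kg * KG_TO_LB), "lb")  # 119 lb
```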

Map of flight path and driving directions from Comox to Bella Coola

See the map of the shortest flight path between CFB Comox (YQQ) and Bella Coola Airport (QBC).

Airport information

Origin CFB Comox
City: Comox
Country: Canada
IATA Code: YQQ
ICAO Code: CYQQ
Coordinates: 49°42′38″N, 124°53′13″W
Destination Bella Coola Airport
City: Bella Coola
Country: Canada
IATA Code: QBC
ICAO Code: CYBD
Coordinates: 52°23′15″N, 126°35′45″W
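The coordinates above are in degrees/minutes/seconds; distance formulas need decimal degrees. A minimal converter (southern and western hemispheres become negative):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# CFB Comox (YQQ): 49°42′38″N, 124°53′13″W
print(dms_to_decimal(49, 42, 38, "N"), dms_to_decimal(124, 53, 13, "W"))
```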