
How far is Bella Coola from Saint John?

The distance between Saint John (Saint John Airport) and Bella Coola (Bella Coola Airport) is 2727 miles / 4389 kilometers / 2370 nautical miles.

The driving distance from Saint John (YSJ) to Bella Coola (QBC) is 3493 miles / 5621 kilometers, and travel time by car is about 74 hours 34 minutes.

Saint John Airport – Bella Coola Airport

2727 Miles
4389 Kilometers
2370 Nautical miles


Distance from Saint John to Bella Coola

There are several ways to calculate the distance from Saint John to Bella Coola. Here are two standard methods:

Vincenty's formula (applied above)
  • 2727.231 miles
  • 4389.053 kilometers
  • 2369.899 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
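For reference, below is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The airport coordinates are taken from the airport information section (converted to decimal degrees), and the result should land close to the 4389-kilometer figure quoted above; this is an illustrative implementation, not the calculator's own code.

import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Ellipsoidal distance in kilometers (Vincenty inverse formula, WGS-84)."""
    a = 6378137.0               # equatorial radius in meters
    f = 1 / 298.257223563       # flattening
    b = (1 - f) * a             # polar radius

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):        # iterate until the longitude difference converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.sqrt((cosU2 * sin_lam) ** 2 +
                              (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        if sin_sigma == 0:
            return 0.0          # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
        B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0   # meters -> kilometers

# YSJ and QBC coordinates in decimal degrees (see airport information below)
print(round(vincenty_distance(45.31583, -65.89028, 52.38750, -126.59583)))  # ≈ 4389 km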

Haversine formula
  • 2719.288 miles
  • 4376.269 kilometers
  • 2362.996 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
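A corresponding haversine sketch, assuming a spherical Earth with a mean radius of 6371 km, might look like this; it should come out near the 4376-kilometer figure above.

import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in kilometers on a sphere of the given radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_km * math.atan2(math.sqrt(a), math.sqrt(1 - a))

# YSJ -> QBC on a spherical Earth
print(round(haversine_distance(45.31583, -65.89028, 52.38750, -126.59583)))  # ≈ 4376 km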

How long does it take to fly from Saint John to Bella Coola?

The estimated flight time from Saint John Airport to Bella Coola Airport is 5 hours and 39 minutes.
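The assumptions behind this estimate aren't published on the page, but a rough sanity check can be made from the distance, an assumed average cruise speed, and a fixed allowance for taxi, climb and descent. Both numbers below are assumptions, not the calculator's own parameters, so the result only lands in the same ballpark.

def estimate_flight_time(distance_km, cruise_kmh=840, overhead_min=30):
    """Very rough block-time estimate: cruise time at an assumed average speed
    plus an assumed fixed allowance for taxi, climb and descent."""
    minutes = overhead_min + distance_km / cruise_kmh * 60
    return divmod(round(minutes), 60)

hours, mins = estimate_flight_time(4389)
print(f"{hours} h {mins} min")  # roughly 5 h 44 min, close to the 5 h 39 min quoted above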

Flight carbon footprint between Saint John Airport (YSJ) and Bella Coola Airport (QBC)

On average, flying from Saint John to Bella Coola generates about 302 kg (665 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
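The page doesn't state its emissions model, but the figure is consistent with an assumed factor of roughly 69 grams of CO2 per passenger-kilometre flown. A sketch using that assumed factor, together with the kilogram-to-pound conversion:

def co2_estimate_kg(distance_km, grams_per_pax_km=69):
    """Per-passenger CO2 from jet fuel burn, using an assumed emission factor."""
    return distance_km * grams_per_pax_km / 1000

kg = co2_estimate_kg(4389)
print(round(kg), "kg CO2 per passenger")            # ≈ 303 kg with the assumed factor
print(round(kg * 2.20462), "lb CO2 per passenger")  # kilograms converted to pounds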

Map of flight path and driving directions from Saint John to Bella Coola

See the map of the shortest flight path between Saint John Airport (YSJ) and Bella Coola Airport (QBC).

Airport information

Origin: Saint John Airport
City: Saint John
Country: Canada
IATA Code: YSJ
ICAO Code: CYSJ
Coordinates: 45°18′57″N, 65°53′25″W
Destination: Bella Coola Airport
City: Bella Coola
Country: Canada
IATA Code: QBC
ICAO Code: CYBD
Coordinates: 52°23′15″N, 126°35′45″W
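The coordinates above are given in degrees, minutes and seconds; the decimal-degree values used in the distance sketches earlier can be derived like this:

def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds to signed decimal degrees.
    South and West hemispheres are negative."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

print(dms_to_decimal(45, 18, 57, "N"), dms_to_decimal(65, 53, 25, "W"))   # YSJ ≈ 45.3158, -65.8903
print(dms_to_decimal(52, 23, 15, "N"), dms_to_decimal(126, 35, 45, "W"))  # QBC ≈ 52.3875, -126.5958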