
How far is St. Lewis from Bella Coola?

The distance between Bella Coola (Bella Coola Airport) and St. Lewis (St. Lewis (Fox Harbour) Airport) is 2875 miles / 4627 kilometers / 2499 nautical miles.

The driving distance from Bella Coola (QBC) to St. Lewis (YFX) is 4396 miles / 7074 kilometers, and travel time by car is about 97 hours 18 minutes.

Bella Coola Airport – St. Lewis (Fox Harbour) Airport
  • Distance: 2875 miles / 4627 kilometers / 2499 nautical miles
  • Flight time: 5 h 56 min
  • Time difference: 4 h 30 min
  • CO2 emission: 319 kg


Distance from Bella Coola to St. Lewis

There are several ways to calculate the distance from Bella Coola to St. Lewis. Here are two standard methods:

Vincenty's formula (applied above)
  • 2875.288 miles
  • 4627.327 kilometers
  • 2498.557 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
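As an illustration, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid, using the airport coordinates listed at the bottom of this page converted to decimal degrees. The calculator's exact implementation is not published, so treat this as a sketch of the standard method rather than a reproduction of it; it should return a distance close to the 4,627 km shown above.

```python
import math

def vincenty_distance_m(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Vincenty's inverse formula on the WGS-84 ellipsoid (distance in metres)."""
    a = 6378137.0            # WGS-84 semi-major axis
    f = 1 / 298.257223563    # WGS-84 flattening
    b = (1 - f) * a          # semi-minor axis

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0       # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a * a - b * b) / (b * b)
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma)

# Bella Coola (QBC) and St. Lewis (YFX), from the airport coordinates below
qbc = (52.3875, -126.595833)
yfx = (52.372778, -55.673889)
km = vincenty_distance_m(*qbc, *yfx) / 1000
print(f"{km:.3f} km, {km / 1.609344:.3f} mi, {km / 1.852:.3f} NM")
```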

Haversine formula
  • 2866.052 miles
  • 4612.464 kilometers
  • 2490.531 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
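For comparison, here is a short Python sketch of the haversine formula, assuming the conventional mean Earth radius of 6,371 km. With the coordinates from the airport section below, it should land close to the 4,612 km figure above.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given radius, in kilometres."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

# Bella Coola (QBC) to St. Lewis (YFX)
km = haversine_km(52.3875, -126.595833, 52.372778, -55.673889)
print(f"{km:.3f} km = {km / 1.609344:.3f} mi = {km / 1.852:.3f} NM")
```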

How long does it take to fly from Bella Coola to St. Lewis?

The estimated flight time from Bella Coola Airport to St. Lewis (Fox Harbour) Airport is 5 hours and 56 minutes.
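The page does not state the parameters behind this estimate. A common rule of thumb is cruise time (distance divided by a typical airliner cruise speed) plus a fixed allowance for take-off and landing; the sketch below assumes 500 mph and a 30-minute allowance, so it will not reproduce the 5 hours and 56 minutes figure exactly.

```python
def estimated_flight_time_h(distance_miles, cruise_mph=500, allowance_h=0.5):
    # Assumed values: 500 mph cruise speed and a 30-minute take-off/landing allowance.
    return distance_miles / cruise_mph + allowance_h

hours = estimated_flight_time_h(2875.288)
h, m = int(hours), round((hours % 1) * 60)
print(f"about {h} h {m} min")  # differs somewhat from the page's 5 h 56 min estimate
```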

Flight carbon footprint between Bella Coola Airport (QBC) and St. Lewis (Fox Harbour) Airport (YFX)

On average, flying from Bella Coola to St. Lewis generates about 319 kg of CO2 per passenger, which is equivalent to 704 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
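The kilogram-to-pound conversion, and a per-mile rate derived from the page's own figures, can be checked directly; the emission estimate itself depends on aircraft and load-factor assumptions that are not given here.

```python
KG_PER_LB = 0.45359237           # exact definition of the avoirdupois pound

co2_kg = 319                      # per-passenger estimate from this page
co2_lb = co2_kg / KG_PER_LB       # ≈ 703 lb (the page shows 704, likely rounded from an unrounded kg value)
kg_per_mile = co2_kg / 2875.288   # ≈ 0.11 kg of CO2 per mile, derived from the figures above
print(f"{co2_lb:.0f} lb, {kg_per_mile:.3f} kg per mile")
```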

Map of flight path and driving directions from Bella Coola to St. Lewis

See the map of the shortest flight path between Bella Coola Airport (QBC) and St. Lewis (Fox Harbour) Airport (YFX).

Airport information

Origin Bella Coola Airport
City: Bella Coola
Country: Canada
IATA Code: QBC
ICAO Code: CYBD
Coordinates: 52°23′15″N, 126°35′45″W
Destination St. Lewis (Fox Harbour) Airport
City: St. Lewis
Country: Canada
IATA Code: YFX
ICAO Code: CCK4
Coordinates: 52°22′22″N, 55°40′26″W
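The coordinates above are given in degrees, minutes, and seconds, while the distance sketches earlier on this page use decimal degrees. A small helper for the conversion (the sign convention here, negative for west and south, is an assumption of the sketch):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to signed decimal degrees."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

# Bella Coola Airport (QBC): 52°23′15″N, 126°35′45″W
print(dms_to_decimal(52, 23, 15, "N"), dms_to_decimal(126, 35, 45, "W"))
# St. Lewis (Fox Harbour) Airport (YFX): 52°22′22″N, 55°40′26″W
print(dms_to_decimal(52, 22, 22, "N"), dms_to_decimal(55, 40, 26, "W"))
```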