How far is Bella Coola from Sept-Iles?

The distance between Sept-Iles (Sept-Îles Airport) and Bella Coola (Bella Coola Airport) is 2542 miles / 4090 kilometers / 2209 nautical miles.

The driving distance from Sept-Iles (YZV) to Bella Coola (QBC) is 3588 miles / 5774 kilometers, and travel time by car is about 75 hours 48 minutes.


Distance from Sept-Iles to Bella Coola

There are several ways to calculate the distance from Sept-Iles to Bella Coola. Here are two standard methods:

Vincenty's formula (applied above)
  • 2541.612 miles
  • 4090.327 kilometers
  • 2208.600 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
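As a rough illustration (not necessarily how this site computes it), an ellipsoidal distance of this kind can be reproduced in Python with geopy, whose geodesic() uses Karney's algorithm on the WGS-84 ellipsoid and, on a route like this, agrees with Vincenty's formula to within a fraction of a metre. The decimal coordinates below are converted from the DMS values listed in the airport information section.

```python
# Sketch only: ellipsoidal (Vincenty-style) distance with geopy (pip install geopy).
from geopy.distance import geodesic

yzv = (50.2231, -66.2656)   # Sept-Îles Airport (50°13′23″N, 66°15′56″W)
qbc = (52.3875, -126.5958)  # Bella Coola Airport (52°23′15″N, 126°35′45″W)

d = geodesic(yzv, qbc)      # WGS-84 ellipsoid by default
print(f"{d.miles:.1f} mi / {d.kilometers:.1f} km / {d.nautical:.1f} nmi")
# Should land close to the figures above: ~2541.6 mi / ~4090.3 km / ~2208.6 nmi
```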

Haversine formula
  • 2533.622 miles
  • 4077.469 kilometers
  • 2201.657 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
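The haversine calculation is simple enough to sketch directly. The snippet below is a minimal example rather than the site's own code: it converts the DMS coordinates from the airport information section to decimal degrees and assumes a mean Earth radius of 6371.0 km, so a different radius shifts the result slightly.

```python
# Sketch: great-circle (haversine) distance between the two airports.
from math import radians, sin, cos, asin, sqrt

def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two points on a spherical Earth (assumed radius)."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * radius_km * asin(sqrt(a))

# Sept-Îles (YZV): 50°13′23″N, 66°15′56″W -- Bella Coola (QBC): 52°23′15″N, 126°35′45″W
yzv = (dms_to_decimal(50, 13, 23, "N"), dms_to_decimal(66, 15, 56, "W"))
qbc = (dms_to_decimal(52, 23, 15, "N"), dms_to_decimal(126, 35, 45, "W"))

km = haversine_km(yzv[0], yzv[1], qbc[0], qbc[1])
print(f"{km / 1.609344:.1f} mi / {km:.1f} km / {km / 1.852:.1f} nmi")
# Roughly 2534 mi / 4078 km / 2202 nmi, in line with the haversine figures above.
```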

How long does it take to fly from Sept-Iles to Bella Coola?

The estimated flight time from Sept-Îles Airport to Bella Coola Airport is 5 hours and 18 minutes.

Flight carbon footprint between Sept-Îles Airport (YZV) and Bella Coola Airport (QBC)

On average, flying from Sept-Iles to Bella Coola generates about 280 kg of CO2 per passenger, which is equal to about 617 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
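The pounds figure is simply a unit conversion of the 280 kg estimate (1 kg ≈ 2.20462 lb), as the quick check below shows.

```python
# Quick check of the kg-to-lb conversion used above.
co2_kg = 280
co2_lb = co2_kg * 2.20462
print(f"{co2_lb:.0f} lbs")  # ≈ 617 lbs
```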

Map of flight path and driving directions from Sept-Iles to Bella Coola

See the map of the shortest flight path between Sept-Îles Airport (YZV) and Bella Coola Airport (QBC).

Airport information

Origin: Sept-Îles Airport
City: Sept-Iles
Country: Canada
IATA Code: YZV
ICAO Code: CYZV
Coordinates: 50°13′23″N, 66°15′56″W

Destination: Bella Coola Airport
City: Bella Coola
Country: Canada
IATA Code: QBC
ICAO Code: CYBD
Coordinates: 52°23′15″N, 126°35′45″W