
How far is Lopez, WA, from Bella Coola?

The distance between Bella Coola (Bella Coola Airport) and Lopez (Lopez Island Airport) is 314 miles / 506 kilometers / 273 nautical miles.

The driving distance from Bella Coola (QBC) to Lopez (LPS) is 656 miles / 1056 kilometers, and travel time by car is about 16 hours 7 minutes.

Bella Coola Airport – Lopez Island Airport

314 miles / 506 kilometers / 273 nautical miles


Distance from Bella Coola to Lopez

There are several ways to calculate the distance from Bella Coola to Lopez. Here are two standard methods:

Vincenty's formula (applied above)
  • 314.354 miles
  • 505.903 kilometers
  • 273.166 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
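As a rough illustration, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid (the coordinates come from the airport information below; the iteration cap and convergence tolerance are assumptions). It should reproduce the 314.354-mile figure above to within rounding:

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns meters."""
    a = 6378137.0            # semi-major axis (m)
    f = 1 / 298.257223563    # flattening
    b = (1 - f) * a          # semi-minor axis (m)

    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(phi1))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    # Iterate on the longitude difference on the auxiliary sphere.
    # (Nearly antipodal points may fail to converge; not an issue for this route.)
    lam = L
    for _ in range(200):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma)  # geodesic length in meters

# QBC and LPS coordinates from the airport information below, in decimal degrees
meters = vincenty_distance(52.3875, -126.595833, 48.483889, -122.937778)
print(f"{meters / 1609.344:.3f} miles")  # ≈ 314.354
```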

Haversine formula
  • 314.009 miles
  • 505.348 kilometers
  • 272.866 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
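For comparison, a compact haversine implementation (again in Python, with the conventional 6,371 km mean Earth radius as the assumed sphere) lands on the slightly shorter spherical figure:

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given mean radius, in miles."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    km = 2 * radius_km * math.asin(math.sqrt(a))
    return km / 1.609344

print(f"{haversine_miles(52.3875, -126.595833, 48.483889, -122.937778):.3f} miles")
# ≈ 314.0 miles, matching the spherical figure above to within rounding
```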

How long does it take to fly from Bella Coola to Lopez?

The estimated flight time from Bella Coola Airport to Lopez Island Airport is 1 hour and 5 minutes.
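The calculator's exact timing formula isn't published. A common rule of thumb adds a fixed allowance for taxi, climb, and descent to the cruise time; a sketch with assumed values (500 mph cruise, 30 minutes of overhead) comes within a few minutes of the 1 hour 5 minutes quoted above:

```python
def estimated_flight_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    # Rule-of-thumb estimate: fixed taxi/climb/descent overhead plus cruise time.
    # Both parameters are assumptions, not the calculator's published formula.
    return overhead_min + distance_miles / cruise_mph * 60

minutes = estimated_flight_minutes(314.354)
print(f"{int(minutes // 60)} h {round(minutes % 60)} min")  # ≈ 1 h 8 min
```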

What is the time difference between Bella Coola and Lopez?

There is no time difference between Bella Coola and Lopez.

Flight carbon footprint between Bella Coola Airport (QBC) and Lopez Island Airport (LPS)

On average, flying from Bella Coola to Lopez generates about 71 kg of CO2 per passenger, which is equivalent to 157 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
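The kilogram-to-pound conversion, and the per-distance intensity it implies, can be checked with one-line arithmetic (the per-km intensity is an inference from the two published numbers, not a factor the site states):

```python
KG_PER_LB = 0.45359237  # exact definition of the pound in kilograms

co2_kg = 71
print(f"{co2_kg / KG_PER_LB:.0f} lbs")  # 157 lbs, matching the figure above

# Implied per-distance intensity for this short hop (an inference, not a stated factor)
print(f"{co2_kg / 505.903:.3f} kg CO2 per km per passenger")  # ≈ 0.140
```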

Map of flight path and driving directions from Bella Coola to Lopez

See the map of the shortest flight path between Bella Coola Airport (QBC) and Lopez Island Airport (LPS).

Airport information

Origin: Bella Coola Airport
City: Bella Coola
Country: Canada
IATA Code: QBC
ICAO Code: CYBD
Coordinates: 52°23′15″N, 126°35′45″W
Destination: Lopez Island Airport
City: Lopez, WA
Country: United States
IATA Code: LPS
ICAO Code: S31
Coordinates: 48°29′2″N, 122°56′16″W
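The degree/minute/second coordinates above convert to the decimal degrees used by the distance formulas with a small helper (signing south and west negative is the standard convention assumed here):

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to signed decimal degrees."""
    value = deg + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

# QBC: 52°23′15″N, 126°35′45″W  ->  (52.3875, -126.595833...)
print(dms_to_decimal(52, 23, 15, "N"), dms_to_decimal(126, 35, 45, "W"))
# LPS: 48°29′2″N, 122°56′16″W   ->  (48.483889, -122.937778)
print(dms_to_decimal(48, 29, 2, "N"), dms_to_decimal(122, 56, 16, "W"))
```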