
How far is Campbell River from Wabush?

The distance between Wabush (Wabush Airport) and Campbell River (Campbell River Airport) is 2462 miles / 3962 kilometers / 2139 nautical miles.

The driving distance from Wabush (YWK) to Campbell River (YBL) is 3836 miles / 6174 kilometers, and travel time by car is about 77 hours 12 minutes.
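As a quick arithmetic check, those driving figures imply an average speed of just under 50 mph:

    # Implied average speed for the quoted drive (simple sanity check)
    distance_mi = 3836
    hours = 77 + 12 / 60          # 77 hours 12 minutes
    print(f"{distance_mi / hours:.1f} mph")  # -> 49.7 mph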


Distance from Wabush to Campbell River

There are several ways to calculate the distance from Wabush to Campbell River. Here are two standard methods:

Vincenty's formula (applied above)
  • 2461.643 miles
  • 3961.631 kilometers
  • 2139.110 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
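One minimal way to reproduce this figure in Python is the third-party geopy package; its geodesic distance (Karney's method on the WGS-84 ellipsoid) agrees with Vincenty's formula to well under a metre on a route like this:

    # pip install geopy  (third-party; geodesic uses the WGS-84 ellipsoid)
    from geopy.distance import geodesic

    ywk = (52.9217, -66.8642)    # Wabush Airport, decimal degrees
    ybl = (49.9506, -125.2708)   # Campbell River Airport

    d = geodesic(ywk, ybl)
    print(f"{d.miles:.3f} mi, {d.km:.3f} km, {d.nautical:.3f} nmi")
    # -> roughly 2461.6 mi / 3961.6 km / 2139.1 nmi, matching the values above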

Haversine formula
  • 2453.909 miles
  • 3949.184 kilometers
  • 2132.389 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
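The haversine formula is short enough to implement directly. A sketch using Earth's mean radius of 3958.8 miles (the exact result depends slightly on which radius is assumed):

    from math import radians, sin, cos, asin, sqrt

    def haversine_miles(lat1, lon1, lat2, lon2, radius_mi=3958.8):
        # Great-circle distance on a sphere of the given mean radius
        phi1, phi2 = radians(lat1), radians(lat2)
        dphi = radians(lat2 - lat1)
        dlam = radians(lon2 - lon1)
        a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
        return 2 * radius_mi * asin(sqrt(a))

    print(f"{haversine_miles(52.9217, -66.8642, 49.9506, -125.2708):.1f} mi")
    # -> about 2453.9 mi, matching the haversine figure above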

How long does it take to fly from Wabush to Campbell River?

The estimated flight time from Wabush Airport to Campbell River Airport is 5 hours and 9 minutes.
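The method behind this estimate is not stated; the quoted 5 hours 9 minutes is consistent with an assumed average block speed of about 478 mph over the 2462-mile route:

    # Assumed model: distance at a constant average block speed (478 mph is
    # back-calculated from the quoted time, not a published figure)
    def block_time(distance_mi, speed_mph=478):
        h, m = divmod(round(distance_mi / speed_mph * 60), 60)
        return f"{h} hours {m} minutes"

    print(block_time(2462))  # -> "5 hours 9 minutes"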

Flight carbon footprint between Wabush Airport (YWK) and Campbell River Airport (YBL)

On average, flying from Wabush to Campbell River generates about 271 kg of CO2 per passenger; 271 kilograms is equal to 597 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
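The kg-to-lb conversion, and the fuel burn implied by the estimate under the standard factor of 3.16 kg of CO2 per kg of jet fuel burned (the site does not publish its model, so the factor is an assumption), work out as follows:

    KG_TO_LB = 2.20462
    CO2_PER_KG_FUEL = 3.16   # standard emission factor for jet fuel

    co2_kg = 271
    print(f"{co2_kg * KG_TO_LB:.0f} lb")              # -> 597 lb
    print(f"{co2_kg / CO2_PER_KG_FUEL:.1f} kg fuel")  # implied ~85.8 kg fuel per passenger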

Map of flight path and driving directions from Wabush to Campbell River

See the map of the shortest flight path between Wabush Airport (YWK) and Campbell River Airport (YBL).

Airport information

Origin: Wabush Airport
City: Wabush
Country: Canada
IATA Code: YWK
ICAO Code: CYWK
Coordinates: 52°55′18″N, 66°51′51″W
Destination: Campbell River Airport
City: Campbell River
Country: Canada
IATA Code: YBL
ICAO Code: CYBL
Coordinates: 49°57′2″N, 125°16′15″W
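The coordinates above are given in degrees/minutes/seconds; converting them to the signed decimal degrees used by the distance formulas is straightforward:

    def dms_to_decimal(deg, minutes, seconds, hemisphere):
        # South and West hemispheres take a negative sign
        sign = -1 if hemisphere in ("S", "W") else 1
        return sign * (deg + minutes / 60 + seconds / 3600)

    # Wabush (YWK): 52°55′18″N, 66°51′51″W
    print(round(dms_to_decimal(52, 55, 18, "N"), 4),
          round(dms_to_decimal(66, 51, 51, "W"), 4))
    # -> 52.9217 -66.8642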