
How far is Thunder Bay from Campbell River?

The distance between Campbell River (Campbell River Airport) and Thunder Bay (Thunder Bay International Airport) is 1617 miles / 2602 kilometers / 1405 nautical miles.

The driving distance from Campbell River (YBL) to Thunder Bay (YQT) is 2139 miles / 3443 kilometers, and travel time by car is about 40 hours 36 minutes.
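As a quick sanity check on the stated driving figures, the implied average speed (stops included) can be computed directly; this is a sketch using only the numbers above, not a routing engine:

```python
# Rough check of the driving figures quoted above
miles = 2139
hours = 40 + 36 / 60            # 40 hours 36 minutes as decimal hours

avg_speed_mph = miles / hours   # implied average speed over the whole trip
print(f"{avg_speed_mph:.1f} mph")
```

An average in the low 50s mph is plausible for a long highway route with stops.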

Campbell River Airport – Thunder Bay International Airport

1617 miles
2602 kilometers
1405 nautical miles


Distance from Campbell River to Thunder Bay

There are several ways to calculate the distance from Campbell River to Thunder Bay. Here are two standard methods:

Vincenty's formula (applied above)
  • 1617.032 miles
  • 2602.361 kilometers
  • 1405.163 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
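Vincenty's inverse method solves for the geodesic iteratively on the WGS-84 ellipsoid. A minimal sketch, using decimal-degree coordinates converted from the DMS values in the airport information section, might look like this (the convergence tolerance and iteration cap are implementation choices, not part of the source):

```python
import math

def vincenty_inverse(lat1, lon1, lat2, lon2):
    """Distance in meters between two points on the WGS-84 ellipsoid."""
    a, f = 6378137.0, 1 / 298.257223563   # WGS-84 semi-major axis, flattening
    b = (1 - f) * a
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):                  # iterate lambda until it converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
        - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
                               * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - d_sigma)

# YBL -> YQT, decimal degrees from the airport information section
d = vincenty_inverse(49.9506, -125.2708, 48.3717, -89.3239)
print(f"{d / 1000:.1f} km")   # close to the 2602.4 km figure above
```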

Haversine formula
  • 1612.150 miles
  • 2594.504 kilometers
  • 1400.920 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
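The haversine calculation is short enough to show in full; this sketch assumes a mean Earth radius of 6371 km and the decimal-degree coordinates from the airport information section:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, r=6371.0):
    """Great-circle distance in km on a sphere of radius r (mean Earth radius)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# YBL -> YQT, decimal degrees from the airport information section
d = haversine_km(49.9506, -125.2708, 48.3717, -89.3239)
print(f"{d:.1f} km")   # ~2594.5 km, matching the figure above
```

The small gap between this result and the Vincenty figure reflects the spherical versus ellipsoidal Earth models.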

How long does it take to fly from Campbell River to Thunder Bay?

The estimated flight time from Campbell River Airport to Thunder Bay International Airport is 3 hours and 33 minutes.
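The quoted time is consistent with an effective block speed of roughly 455 mph over the 1617-mile great-circle distance; that speed is an assumption inferred from the figures above, not stated by the source:

```python
distance_miles = 1617
block_speed_mph = 455          # assumed effective speed, gate to gate

t = distance_miles / block_speed_mph       # flight time in decimal hours
h, m = int(t), round((t - int(t)) * 60)
print(f"{h} h {m} min")        # 3 h 33 min
```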

Flight carbon footprint between Campbell River Airport (YBL) and Thunder Bay International Airport (YQT)

On average, flying from Campbell River to Thunder Bay generates about 187 kg of CO2 per passenger (about 412 pounds). These figures are estimates and include only the CO2 generated by burning jet fuel.
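The kilogram-to-pound conversion is a straight multiplication by the standard factor of about 2.20462 lb/kg:

```python
KG_TO_LB = 2.20462             # pounds per kilogram
co2_kg = 187                   # per-passenger estimate from above

print(f"{co2_kg * KG_TO_LB:.0f} lb")   # 412 lb
```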

Map of flight path and driving directions from Campbell River to Thunder Bay

See the map of the shortest flight path between Campbell River Airport (YBL) and Thunder Bay International Airport (YQT).

Airport information

Origin: Campbell River Airport
City: Campbell River
Country: Canada
IATA Code: YBL
ICAO Code: CYBL
Coordinates: 49°57′2″N, 125°16′15″W
Destination: Thunder Bay International Airport
City: Thunder Bay
Country: Canada
IATA Code: YQT
ICAO Code: CYQT
Coordinates: 48°22′18″N, 89°19′26″W
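The coordinates above are given in degrees, minutes, and seconds; the distance formulas need decimal degrees. A small conversion helper (the function name is illustrative):

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to decimal degrees."""
    sign = -1 if hemisphere in "SW" else 1   # south and west are negative
    return sign * (deg + minutes / 60 + seconds / 3600)

# Campbell River Airport: 49°57′2″N, 125°16′15″W
lat = dms_to_decimal(49, 57, 2, "N")
lon = dms_to_decimal(125, 16, 15, "W")
print(f"{lat:.4f}, {lon:.4f}")   # 49.9506, -125.2708
```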