How far is London from Campbell River?

The distance between Campbell River (Campbell River Airport) and London (London International Airport) is 2125 miles / 3420 kilometers / 1847 nautical miles.

The driving distance from Campbell River (YBL) to London (YXU) is 2626 miles / 4226 kilometers, and the travel time by car is about 49 hours 41 minutes.

Campbell River Airport – London International Airport


Distance from Campbell River to London

There are several ways to calculate the distance from Campbell River to London. Here are two standard methods:

Vincenty's formula (applied above)
  • 2125.329 miles
  • 3420.385 kilometers
  • 1846.860 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
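
The site doesn't publish its implementation, but Vincenty's inverse method is well documented. Below is a minimal Python sketch on the WGS-84 ellipsoid; it omits the special-case handling (coincident and near-antipodal points) that production code needs, and the airport coordinates are the degrees/minutes/seconds values from the Airport information section converted to decimal degrees.

    from math import radians, sin, cos, tan, atan, atan2, sqrt

    def vincenty_miles(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
        # Vincenty's inverse formula on the WGS-84 ellipsoid.
        a = 6378137.0               # semi-major axis (metres)
        f = 1 / 298.257223563       # flattening
        b = (1 - f) * a             # semi-minor axis (metres)

        L = radians(lon2 - lon1)
        U1 = atan((1 - f) * tan(radians(lat1)))   # reduced latitudes
        U2 = atan((1 - f) * tan(radians(lat2)))
        sinU1, cosU1, sinU2, cosU2 = sin(U1), cos(U1), sin(U2), cos(U2)

        lam = L
        for _ in range(max_iter):   # iterate until lambda converges
            sin_lam, cos_lam = sin(lam), cos(lam)
            sin_sigma = sqrt((cosU2 * sin_lam) ** 2 +
                             (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos2_alpha = 1 - sin_alpha ** 2
            cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
            C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
            lam_prev = lam
            lam = L + (1 - C) * f * sin_alpha * (sigma + C * sin_sigma *
                  (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
            if abs(lam - lam_prev) < tol:
                break

        u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sm ** 2) -
            B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) *
            (-3 + 4 * cos_2sm ** 2)))
        return b * A * (sigma - d_sigma) / 1609.344   # metres -> statute miles

    # YBL 49°57'2"N, 125°16'15"W and YXU 43°2'8"N, 81°9'14"W in decimal degrees
    print(round(vincenty_miles(49.9506, -125.2708, 43.0356, -81.1539), 3))
    # about 2125.3 miles, in line with the Vincenty figure above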

Haversine formula
  • 2119.566 miles
  • 3411.111 kilometers
  • 1841.853 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
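
The haversine formula is compact enough to show in full. A sketch in Python, assuming a mean earth radius of 3,958.8 miles (the exact radius chosen shifts the result slightly):

    from math import radians, sin, cos, asin, sqrt

    def haversine_miles(lat1, lon1, lat2, lon2):
        # Great-circle distance on a sphere of mean radius 3,958.8 miles.
        R = 3958.8
        dphi = radians(lat2 - lat1)
        dlam = radians(lon2 - lon1)
        a = (sin(dphi / 2) ** 2 +
             cos(radians(lat1)) * cos(radians(lat2)) * sin(dlam / 2) ** 2)
        return 2 * R * asin(sqrt(a))

    # Same decimal-degree coordinates for YBL and YXU as above
    print(round(haversine_miles(49.9506, -125.2708, 43.0356, -81.1539), 3))
    # about 2119.6 miles, in line with the haversine figure above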

How long does it take to fly from Campbell River to London?

The estimated flight time from Campbell River Airport to London International Airport is 4 hours and 31 minutes.
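
Estimates like this are typically derived from the great-circle distance plus a fixed allowance for takeoff and landing. A rough reconstruction in Python, assuming a 30-minute overhead and an average speed of about 530 mph (both assumed values, not published parameters):

    def block_time(distance_miles, cruise_mph=530, overhead_min=30):
        # Assumed model: fixed taxi/climb/descent overhead plus cruise time.
        total_min = overhead_min + distance_miles / cruise_mph * 60
        return divmod(round(total_min), 60)

    h, m = block_time(2125.329)
    print(f"{h} hours and {m} minutes")   # 4 hours and 31 minutes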

Flight carbon footprint between Campbell River Airport (YBL) and London International Airport (YXU)

On average, flying from Campbell River to London generates about 232 kg of CO2 per passenger, equivalent to 511 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
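
The pound figure is a direct unit conversion of the same estimate; the quick check below also derives the implied per-mile emission factor (the 232 kg total is the site's estimate, everything else follows from it):

    KG_PER_LB = 0.45359237        # exact kilograms per pound

    co2_kg = 232                  # per-passenger estimate above
    distance_miles = 2125.329

    print(round(co2_kg / KG_PER_LB))            # 511 lbs
    print(round(co2_kg / distance_miles, 3))    # about 0.109 kg CO2 per passenger-mile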

Map of flight path and driving directions from Campbell River to London

See the map of the shortest flight path between Campbell River Airport (YBL) and London International Airport (YXU).

Airport information

Origin: Campbell River Airport
City: Campbell River
Country: Canada
IATA Code: YBL
ICAO Code: CYBL
Coordinates: 49°57′2″N, 125°16′15″W
Destination: London International Airport
City: London
Country: Canada
IATA Code: YXU
ICAO Code: CYXU
Coordinates: 43°2′8″N, 81°9′14″W