
How far is London from Bearskin Lake?

The distance between Bearskin Lake (Bearskin Lake Airport) and London (London International Airport) is 879 miles / 1415 kilometers / 764 nautical miles.

The driving distance from Bearskin Lake (XBE) to London (YXU) is 1328 miles / 2138 kilometers, and travel time by car is about 35 hours 10 minutes.

Bearskin Lake Airport – London International Airport

879 miles / 1415 kilometers / 764 nautical miles


Distance from Bearskin Lake to London

There are several ways to calculate the distance from Bearskin Lake to London. Here are two standard methods:

Vincenty's formula (applied above)
  • 878.979 miles
  • 1414.579 kilometers
  • 763.812 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
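For readers who want to reproduce the figure, below is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid (an assumption — the page does not state which reference ellipsoid or constants it uses). The coordinates are the decimal form of the DMS values listed under airport information below.

    from math import atan, atan2, cos, radians, sin, sqrt, tan

    def vincenty_km(lat1, lon1, lat2, lon2):
        """Inverse Vincenty distance on the WGS-84 ellipsoid (iterative)."""
        a, f = 6378137.0, 1 / 298.257223563   # WGS-84 semi-major axis (m), flattening
        b = (1 - f) * a
        U1 = atan((1 - f) * tan(radians(lat1)))  # reduced latitudes
        U2 = atan((1 - f) * tan(radians(lat2)))
        L = radians(lon2 - lon1)
        sinU1, cosU1, sinU2, cosU2 = sin(U1), cos(U1), sin(U2), cos(U2)

        lam = L
        for _ in range(200):                   # iterate until the longitude converges
            sin_sig = sqrt((cosU2 * sin(lam)) ** 2 +
                           (cosU1 * sinU2 - sinU1 * cosU2 * cos(lam)) ** 2)
            if sin_sig == 0:
                return 0.0                     # coincident points
            cos_sig = sinU1 * sinU2 + cosU1 * cosU2 * cos(lam)
            sigma = atan2(sin_sig, cos_sig)
            sin_alpha = cosU1 * cosU2 * sin(lam) / sin_sig
            cos2_alpha = 1 - sin_alpha ** 2
            cos_2sigm = cos_sig - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
            C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
            lam_prev = lam
            lam = L + (1 - C) * f * sin_alpha * (
                sigma + C * sin_sig * (cos_2sigm + C * cos_sig * (-1 + 2 * cos_2sigm ** 2)))
            if abs(lam - lam_prev) < 1e-12:
                break

        u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        d_sig = B * sin_sig * (cos_2sigm + B / 4 * (
            cos_sig * (-1 + 2 * cos_2sigm ** 2) -
            B / 6 * cos_2sigm * (-3 + 4 * sin_sig ** 2) * (-3 + 4 * cos_2sigm ** 2)))
        return b * A * (sigma - d_sig) / 1000.0  # metres -> kilometres

    print(vincenty_km(53.965556, -91.026944, 43.035556, -81.153889))  # ≈ 1414.6 km

Note that Vincenty's iteration is known to converge slowly, or fail, for nearly antipodal points; that is not a concern for this route.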

Haversine formula
  • 878.266 miles
  • 1413.433 kilometers
  • 763.193 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
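The haversine computation is short enough to show in full. Here is a sketch in Python, assuming the conventional mean Earth radius of 6,371 km (the radius constant the page uses is unstated, so the last digits may differ slightly):

    from math import radians, sin, cos, asin, sqrt

    def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
        """Great-circle distance between two points on a sphere."""
        phi1, phi2 = radians(lat1), radians(lat2)
        dphi = radians(lat2 - lat1)
        dlmb = radians(lon2 - lon1)
        a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
        return 2 * radius_km * asin(sqrt(a))

    # Airport coordinates converted from the DMS values listed below.
    XBE = (53.965556, -91.026944)   # 53°57′56″N, 91°1′37″W
    YXU = (43.035556, -81.153889)   # 43°2′8″N, 81°9′14″W

    km = haversine_km(*XBE, *YXU)
    print(f"{km:.1f} km  /  {km * 0.621371:.1f} mi  /  {km * 0.539957:.1f} NM")
    # ≈ 1413–1414 km, matching the haversine figure above to within rounding.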

How long does it take to fly from Bearskin Lake to London?

The estimated flight time from Bearskin Lake Airport to London International Airport is 2 hours and 9 minutes.
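The page does not say how the flight time is estimated. A common heuristic adds a fixed allowance for taxi, climb, and descent to the cruise segment; the sketch below uses hypothetical parameters (about 30 minutes of overhead and a 500 mph cruise speed) and lands within a few minutes of the 2 hours 9 minutes quoted above:

    # Hypothetical flight-time heuristic: the page's exact model is unstated.
    # Assumed parameters: ~30 min taxi/climb/descent overhead, ~500 mph cruise.
    def estimated_flight_minutes(distance_miles, cruise_mph=500, overhead_min=30):
        return overhead_min + distance_miles / cruise_mph * 60

    minutes = estimated_flight_minutes(879)
    print(f"{int(minutes // 60)} h {round(minutes % 60)} min")  # ≈ 2 h 15 min with these assumptions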

Flight carbon footprint between Bearskin Lake Airport (XBE) and London International Airport (YXU)

On average, flying from Bearskin Lake to London generates about 142 kg (313 lb) of CO2 per passenger. These figures are estimates and cover only the CO2 produced by burning jet fuel.
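As a quick sanity check, the kilogram-to-pound conversion and the per-passenger emission factor implied by the page's own numbers can be reproduced in a few lines (the 2.20462 lb/kg constant is standard; the factor is simply backed out of the figures above, not an independent estimate):

    KM = 1415          # great-circle distance quoted above
    CO2_KG = 142       # per-passenger CO2 estimate quoted above

    print(f"implied factor: {CO2_KG / KM:.3f} kg CO2 per passenger-km")  # ≈ 0.100
    print(f"{CO2_KG} kg = {CO2_KG * 2.20462:.0f} lb")                    # 142 kg ≈ 313 lb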

Map of flight path and driving directions from Bearskin Lake to London

See the map of the shortest flight path between Bearskin Lake Airport (XBE) and London International Airport (YXU).

Airport information

Origin: Bearskin Lake Airport
City: Bearskin Lake
Country: Canada
IATA Code: XBE
ICAO Code: CNE3
Coordinates: 53°57′56″N, 91°1′37″W
Destination: London International Airport
City: London
Country: Canada
IATA Code: YXU
ICAO Code: CYXU
Coordinates: 43°2′8″N, 81°9′14″W
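The coordinates above are given in degrees, minutes, and seconds; converting them to the decimal degrees used by the distance formulas is a one-line calculation:

    # Minimal DMS -> decimal-degrees conversion for the coordinates listed above.
    def dms_to_decimal(degrees, minutes, seconds, hemisphere):
        sign = -1 if hemisphere in ("S", "W") else 1
        return sign * (degrees + minutes / 60 + seconds / 3600)

    print(dms_to_decimal(53, 57, 56, "N"), dms_to_decimal(91, 1, 37, "W"))  # XBE ≈ 53.9656, -91.0269
    print(dms_to_decimal(43, 2, 8, "N"), dms_to_decimal(81, 9, 14, "W"))    # YXU ≈ 43.0356, -81.1539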