
How far is London from St. Lewis?

The distance between St. Lewis (St. Lewis (Fox Harbour) Airport) and London (London International Airport) is 1342 miles / 2160 kilometers / 1166 nautical miles.

The driving distance from St. Lewis (YFX) to London (YXU) is 1819 miles / 2928 kilometers, and travel time by car is about 42 hours 32 minutes.

St. Lewis (Fox Harbour) Airport – London International Airport

  • Distance: 1342 miles / 2160 kilometers / 1166 nautical miles
  • Flight time: 3 h 2 min
  • Time difference: 1 h 30 min
  • CO2 emission: 170 kg


Distance from St. Lewis to London

There are several ways to calculate the distance from St. Lewis to London. Here are two standard methods:

Vincenty's formula (applied above)
  • 1341.938 miles
  • 2159.640 kilometers
  • 1166.112 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
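For readers who want to reproduce these figures, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The function name and the decimal coordinates (converted from the DMS values listed under Airport information below) are illustrative, not the calculator's actual code:

    import math

    def vincenty_inverse(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
        """Distance in metres between two points on the WGS-84 ellipsoid.
        May not converge for near-antipodal points."""
        a = 6378137.0          # semi-major axis (m)
        f = 1 / 298.257223563  # flattening
        b = (1 - f) * a        # semi-minor axis (m)

        L = math.radians(lon2 - lon1)
        U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
        U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)

        lam = L
        for _ in range(max_iter):
            sinLam, cosLam = math.sin(lam), math.cos(lam)
            sinSigma = math.hypot(cosU2 * sinLam,
                                  cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
            if sinSigma == 0:
                return 0.0  # coincident points
            cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
            sigma = math.atan2(sinSigma, cosSigma)
            sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
            cos2Alpha = 1 - sinAlpha ** 2
            cos2SigmaM = (cosSigma - 2 * sinU1 * sinU2 / cos2Alpha
                          if cos2Alpha else 0.0)  # equatorial line
            C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
            lamPrev = lam
            lam = L + (1 - C) * f * sinAlpha * (
                sigma + C * sinSigma * (
                    cos2SigmaM + C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
            if abs(lam - lamPrev) < tol:
                break

        u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        dSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
            cosSigma * (-1 + 2 * cos2SigmaM ** 2)
            - B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2)
            * (-3 + 4 * cos2SigmaM ** 2)))
        return b * A * (sigma - dSigma)

    yfx = (52.372778, -55.673889)  # St. Lewis (Fox Harbour) Airport
    yxu = (43.035556, -81.153889)  # London International Airport
    m = vincenty_inverse(*yfx, *yxu)
    print(f"{m / 1609.344:.3f} mi, {m / 1000:.3f} km, {m / 1852:.3f} nmi")
    # roughly 1341.9 mi / 2159.6 km / 1166.1 nmi, matching the figures above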

Haversine formula
  • 1338.936 miles
  • 2154.808 kilometers
  • 1163.503 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
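The same check in Python, assuming the commonly used mean Earth radius of 6,371 km (the radius the calculator assumes is not stated, so the last digits may differ slightly):

    import math

    def haversine_m(lat1, lon1, lat2, lon2, radius_m=6371000.0):
        """Great-circle distance in metres on a sphere of the given radius."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        a = (math.sin(dphi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
        return 2 * radius_m * math.asin(math.sqrt(a))

    m = haversine_m(52.372778, -55.673889, 43.035556, -81.153889)
    print(f"{m / 1609.344:.3f} mi, {m / 1000:.3f} km, {m / 1852:.3f} nmi")
    # about 1338.9 mi / 2154.8 km / 1163.5 nmi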

How long does it take to fly from St. Lewis to London?

The estimated flight time from St. Lewis (Fox Harbour) Airport to London International Airport is 3 hours and 2 minutes.
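The calculator's exact timing model isn't published. A common rule of thumb estimates gate-to-gate time as cruise time at a typical jet speed plus a fixed allowance for taxi, climb, and descent; in the sketch below, both the 500 mph cruise speed and the 30-minute allowance are assumptions:

    def flight_time_hours(distance_miles, cruise_mph=500.0, overhead_hours=0.5):
        """Rough gate-to-gate estimate: cruise time plus a fixed allowance."""
        return overhead_hours + distance_miles / cruise_mph

    hours = flight_time_hours(1342)
    h, m = int(hours), round(hours % 1 * 60)
    print(f"{h} h {m} min")  # ~3 h 11 min with these assumed parameters

With these assumed parameters the result lands close to, but not exactly on, the 3 hours 2 minutes shown above.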

Flight carbon footprint between St. Lewis (Fox Harbour) Airport (YFX) and London International Airport (YXU)

On average, flying from St. Lewis to London generates about 170 kg of CO2 per passenger; 170 kilograms is roughly 375 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
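For context, the per-passenger figure can be converted to pounds and normalized per kilometre flown. The 170 kg estimate comes from the page above; only the arithmetic below is illustrative:

    KG_PER_LB = 0.45359237  # exact definition of the pound

    co2_kg = 170.0          # per-passenger estimate from above
    print(f"{co2_kg / KG_PER_LB:.0f} lbs")                       # ≈ 375 lbs
    print(f"{co2_kg / 2160 * 1000:.0f} g CO2 per passenger-km")  # ≈ 79 g/km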

Map of flight path and driving directions from St. Lewis to London

See the map of the shortest flight path between St. Lewis (Fox Harbour) Airport (YFX) and London International Airport (YXU).

Airport information

Origin: St. Lewis (Fox Harbour) Airport
City: St. Lewis
Country: Canada
IATA Code: YFX
ICAO Code: CCK4
Coordinates: 52°22′22″N, 55°40′26″W
Destination: London International Airport
City: London
Country: Canada
IATA Code: YXU
ICAO Code: CYXU
Coordinates: 43°2′8″N, 81°9′14″W
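The distance sketches earlier take decimal degrees, while the listings above use degrees-minutes-seconds. A small helper (hypothetical, for illustration) converts between the two:

    def dms_to_decimal(deg, minutes, seconds, hemisphere):
        """Convert D°M′S″ plus a hemisphere letter to signed decimal degrees."""
        sign = -1.0 if hemisphere in ("S", "W") else 1.0
        return sign * (deg + minutes / 60 + seconds / 3600)

    print(f"{dms_to_decimal(52, 22, 22, 'N'):.6f}, "
          f"{dms_to_decimal(55, 40, 26, 'W'):.6f}")  # YFX: 52.372778, -55.673889
    print(f"{dms_to_decimal(43, 2, 8, 'N'):.6f}, "
          f"{dms_to_decimal(81, 9, 14, 'W'):.6f}")   # YXU: 43.035556, -81.153889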