
How far is London from Fort Simpson?

The distance between Fort Simpson (Fort Simpson Airport) and London (London International Airport) is 2079 miles / 3346 kilometers / 1807 nautical miles.

The driving distance from Fort Simpson (YFS) to London (YXU) is 2865 miles / 4610 kilometers, and travel time by car is about 58 hours 29 minutes.

Fort Simpson Airport – London International Airport

2079 miles / 3346 kilometers / 1807 nautical miles


Distance from Fort Simpson to London

There are several ways to calculate the distance from Fort Simpson to London. Here are two standard methods:

Vincenty's formula (applied above)
  • 2078.893 miles
  • 3345.654 kilometers
  • 1806.509 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
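As a sketch of the ellipsoidal calculation, the snippet below uses Python's geopy package (an assumption; this page does not state its implementation). geopy's geodesic() applies Karney's algorithm on the WGS-84 ellipsoid rather than Vincenty's iteration, but both model the earth as an ellipsoid and should agree with the figures above to within a few meters.

from geopy.distance import geodesic

# Airport coordinates in decimal degrees (from the airport information below)
yfs = (61.7600, -121.2369)   # Fort Simpson Airport (YFS)
yxu = (43.0356, -81.1539)    # London International Airport (YXU)

d = geodesic(yfs, yxu)       # WGS-84 ellipsoid, Karney's method
print(f"{d.miles:.3f} miles, {d.km:.3f} km, {d.nm:.3f} nautical miles")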

Haversine formula
  • 2074.297 miles
  • 3338.258 kilometers
  • 1802.515 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth, yielding the great-circle distance (the shortest path between the two points along the earth's surface).
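A self-contained haversine sketch in Python, assuming the common mean earth radius of 6371 km (the radius this page uses is not stated, so the last digits may differ slightly):

import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points, in kilometers."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

km = haversine_km(61.7600, -121.2369, 43.0356, -81.1539)
print(f"{km / 1.609344:.3f} miles, {km:.3f} km, {km / 1.852:.3f} nautical miles")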

How long does it take to fly from Fort Simpson to London?

The estimated flight time from Fort Simpson Airport to London International Airport is 4 hours and 26 minutes.
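The page does not publish its flight-time model. A common rule of thumb, shown below purely as an assumption, is an average block speed of about 500 mph plus roughly 30 minutes for taxi, takeoff, and landing; it lands within minutes of the estimate above.

def flight_time_minutes(miles, cruise_mph=500, overhead_min=30):
    # Assumed heuristic: constant average cruise speed plus fixed overhead.
    return overhead_min + 60 * miles / cruise_mph

total = flight_time_minutes(2079)
print(f"about {int(total // 60)} h {round(total % 60)} min")  # about 4 h 39 min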

Flight carbon footprint between Fort Simpson Airport (YFS) and London International Airport (YXU)

On average, flying from Fort Simpson to London generates about 226 kg (499 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
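A back-of-the-envelope sketch: dividing the page's own figures gives an implied factor of roughly 0.109 kg of CO2 per passenger-mile on this route. That factor is an assumption derived from 226 kg / 2079 miles; real emissions vary by aircraft type, load factor, and routing.

CO2_KG_PER_PASSENGER_MILE = 226 / 2079   # implied by this page's own figures
KG_PER_POUND = 0.45359237

kg = 2079 * CO2_KG_PER_PASSENGER_MILE
print(f"{kg:.0f} kg = {kg / KG_PER_POUND:.0f} lbs")
# Prints 226 kg = 498 lbs; the page's 499 lbs likely reflects an unrounded kg figure.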

Map of flight path and driving directions from Fort Simpson to London

See the map of the shortest flight path between Fort Simpson Airport (YFS) and London International Airport (YXU).

Airport information

Origin: Fort Simpson Airport
City: Fort Simpson
Country: Canada
IATA Code: YFS
ICAO Code: CYFS
Coordinates: 61°45′36″N, 121°14′13″W
Destination: London International Airport
City: London
Country: Canada
IATA Code: YXU
ICAO Code: CYXU
Coordinates: 43°2′8″N, 81°9′14″W
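The distance formulas above take decimal degrees, so the listed degrees/minutes/seconds coordinates must be converted first. A minimal sketch (west and south hemispheres are negative):

def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    # Convert degrees/minutes/seconds plus a hemisphere letter to signed decimal degrees.
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

print(dms_to_decimal(61, 45, 36, "N"), dms_to_decimal(121, 14, 13, "W"))  # YFS: 61.76, -121.2369...
print(dms_to_decimal(43, 2, 8, "N"), dms_to_decimal(81, 9, 14, "W"))      # YXU: 43.0355..., -81.1538...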