
How far is London from Fort McPherson?

The distance between Fort McPherson (Fort McPherson Airport) and London (London International Airport) is 2578 miles / 4149 kilometers / 2240 nautical miles.

The driving distance from Fort McPherson (ZFM) to London (YXU) is 3857 miles / 6207 kilometers, and travel time by car is about 82 hours 50 minutes.

Fort McPherson Airport – London International Airport

2578 miles
4149 kilometers
2240 nautical miles


Distance from Fort McPherson to London

There are several ways to calculate the distance from Fort McPherson to London. Here are two standard methods:

Vincenty's formula (applied above)
  • 2577.925 miles
  • 4148.768 kilometers
  • 2240.155 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
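The ellipsoidal calculation described above can be sketched with a standard implementation of Vincenty's inverse formula on the WGS-84 ellipsoid. The coordinates are the airport positions listed in the airport information section below; the iteration tolerance and the variable names are implementation choices, not part of the formula itself.

```python
import math

# WGS-84 ellipsoid parameters
A = 6378137.0          # semi-major axis (m)
F = 1 / 298.257223563  # flattening
B = (1 - F) * A        # semi-minor axis (m)

def vincenty(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Distance in meters between two points via Vincenty's inverse formula."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)
    # Reduced latitudes on the auxiliary sphere
    U1 = math.atan((1 - F) * math.tan(phi1))
    U2 = math.atan((1 - F) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L  # initial guess for the difference in longitude on the sphere
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)  # equatorial geodesic
        C = F / 16 * cos2_alpha * (4 + F * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * F * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (A ** 2 - B ** 2) / B ** 2
    A_ = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B_ = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B_ * sin_sigma * (
        cos_2sigma_m + B_ / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B_ / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sigma_m ** 2)))
    return B * A_ * (sigma - delta_sigma)

# ZFM (67°24′27″N, 134°51′39″W) and YXU (43°2′8″N, 81°9′14″W) in decimal degrees
zfm = (67.4075, -134.860833)
yxu = (43.035556, -81.153889)
print(round(vincenty(*zfm, *yxu) / 1000, 3), "km")
```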

Haversine formula
  • 2571.822 miles
  • 4138.947 kilometers
  • 2234.853 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
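As a sketch, the spherical calculation is much simpler. The implementation below uses a mean earth radius of 6371 km, which is a common choice for haversine distances (the site does not state which radius it uses):

```python
import math

def haversine(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance assuming a spherical earth of the given radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    # Haversine of the central angle between the two points
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# ZFM to YXU, coordinates from the airport information below
print(round(haversine(67.4075, -134.860833, 43.035556, -81.153889), 3), "km")
```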

How long does it take to fly from Fort McPherson to London?

The estimated flight time from Fort McPherson Airport to London International Airport is 5 hours and 22 minutes.
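Estimates like this typically combine the flight distance, an assumed average cruise speed, and a fixed allowance for taxi, climb, and descent. The site does not publish its model; the sketch below uses illustrative figures (500 mph cruise, 30-minute overhead), so its result differs somewhat from the 5 hours 22 minutes quoted above.

```python
def flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough flight-time estimate; cruise speed and overhead are assumptions."""
    minutes = distance_miles / cruise_mph * 60 + overhead_min
    return divmod(round(minutes), 60)  # (hours, minutes)

hours, mins = flight_time(2578)
print(f"about {hours} h {mins} min")
```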

Flight carbon footprint between Fort McPherson Airport (ZFM) and London International Airport (YXU)

On average, flying from Fort McPherson to London generates about 284 kg of CO2 per passenger, which equals about 627 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
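The kilograms-to-pounds conversion uses the exact definition of the avoirdupois pound (1 lb = 0.45359237 kg). Note that converting the already-rounded 284 kg gives about 626 lbs; the 627 lbs quoted above presumably comes from the unrounded CO2 figure.

```python
KG_PER_LB = 0.45359237  # exact definition of the avoirdupois pound

def kg_to_lbs(kg):
    return kg / KG_PER_LB

print(round(kg_to_lbs(284), 1), "lbs")
```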

Map of flight path and driving directions from Fort McPherson to London

See the map of the shortest flight path between Fort McPherson Airport (ZFM) and London International Airport (YXU).

Airport information

Origin Fort McPherson Airport
City: Fort McPherson
Country: Canada
IATA Code: ZFM
ICAO Code: CZFM
Coordinates: 67°24′27″N, 134°51′39″W
Destination London International Airport
City: London
Country: Canada
IATA Code: YXU
ICAO Code: CYXU
Coordinates: 43°2′8″N, 81°9′14″W