
How far is London from Fort St. John?

The distance between Fort St. John (Fort St. John Airport) and London (London International Airport) is 1961 miles / 3155 kilometers / 1704 nautical miles.

The driving distance from Fort St. John (YXJ) to London (YXU) is 2384 miles / 3836 kilometers, and travel time by car is about 45 hours 45 minutes.

Fort St. John Airport – London International Airport
  • 1961 miles
  • 3155 kilometers
  • 1704 nautical miles


Distance from Fort St. John to London

There are several ways to calculate the distance from Fort St. John to London. Here are two standard methods:

Vincenty's formula (applied above)
  • 1960.656 miles
  • 3155.371 kilometers
  • 1703.764 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
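As a rough illustration, here is a minimal Python sketch of the standard Vincenty inverse formula on the WGS-84 ellipsoid. It is not the calculator's own code; the decimal coordinates are simply the airport coordinates listed further down converted from degrees/minutes/seconds, so the result should land close to the 1960.656 miles quoted above.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns statute miles."""
    a = 6378137.0           # semi-major axis in metres
    f = 1 / 298.257223563   # flattening
    b = (1 - f) * a         # semi-minor axis

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1.0 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2.0 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
        C = f / 16.0 * cos2_alpha * (4.0 + f * (4.0 - 3.0 * cos2_alpha))
        lam_prev = lam
        lam = L + (1.0 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1.0 + 2.0 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break  # converged

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1.0 + u2 / 16384.0 * (4096.0 + u2 * (-768.0 + u2 * (320.0 - 175.0 * u2)))
    B = u2 / 1024.0 * (256.0 + u2 * (-128.0 + u2 * (74.0 - 47.0 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4.0 * (
            cos_sigma * (-1.0 + 2.0 * cos_2sigma_m ** 2)
            - B / 6.0 * cos_2sigma_m
            * (-3.0 + 4.0 * sin_sigma ** 2) * (-3.0 + 4.0 * cos_2sigma_m ** 2)))
    metres = b * A * (sigma - delta_sigma)
    return metres / 1609.344  # metres per statute mile

# YXJ (56°14′17″N, 120°44′23″W) to YXU (43°2′8″N, 81°9′14″W), in decimal degrees
print(round(vincenty_miles(56.2381, -120.7397, 43.0356, -81.1539), 3))
```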

Haversine formula
  • 1955.888 miles
  • 3147.697 kilometers
  • 1699.620 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
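For comparison, here is a minimal haversine sketch in Python, again using the decimal equivalents of the airport coordinates listed below. The mean Earth radius of 6,371 km is an assumption (the calculator does not state which radius it uses), so the output should come out near, but not exactly at, the 1955.888 miles quoted above.

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given mean Earth radius; returns statute miles."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2.0) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2.0) ** 2)
    km = 2.0 * radius_km * math.asin(math.sqrt(a))
    return km / 1.609344  # kilometres per statute mile

# YXJ (56°14′17″N, 120°44′23″W) to YXU (43°2′8″N, 81°9′14″W), in decimal degrees
print(round(haversine_miles(56.2381, -120.7397, 43.0356, -81.1539), 3))
```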

How long does it take to fly from Fort St. John to London?

The estimated flight time from Fort St. John Airport to London International Airport is 4 hours and 12 minutes.
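The page does not say how this figure is derived. A common rule of thumb is to add roughly 30 minutes for takeoff and landing to the distance divided by an assumed average cruise speed; the sketch below uses that model with an illustrative 500 mph cruise speed, so its answer will differ somewhat from the 4 hours 12 minutes shown above.

```python
def estimated_flight_time(distance_miles, cruise_mph=500.0, overhead_hours=0.5):
    """Rule-of-thumb flight time: fixed takeoff/landing overhead plus cruise time."""
    hours = overhead_hours + distance_miles / cruise_mph
    h = int(hours)
    m = round((hours - h) * 60)
    return f"{h} hours {m} minutes"

print(estimated_flight_time(1961))  # roughly "4 hours 25 minutes" with these assumptions
```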

Flight carbon footprint between Fort St. John Airport (YXJ) and London International Airport (YXU)

On average, flying from Fort St. John to London generates about 214 kg of CO2 per passenger, which is equivalent to 472 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
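The pound figure follows directly from the kilogram figure using the standard conversion factor of about 2.20462 lbs per kg, as in this quick sketch:

```python
KG_TO_LBS = 2.20462  # pounds per kilogram

co2_kg = 214
print(f"{co2_kg} kg ≈ {co2_kg * KG_TO_LBS:.0f} lbs")  # prints "214 kg ≈ 472 lbs"
```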

Map of flight path and driving directions from Fort St. John to London

See the map of the shortest flight path between Fort St. John Airport (YXJ) and London International Airport (YXU).

Airport information

Origin Fort St. John Airport
City: Fort St. John
Country: Canada
IATA Code: YXJ
ICAO Code: CYXJ
Coordinates: 56°14′17″N, 120°44′23″W
Destination London International Airport
City: London
Country: Canada
IATA Code: YXU
ICAO Code: CYXU
Coordinates: 43°2′8″N, 81°9′14″W