
How far is London from Friday Harbor, WA?

The distance between Friday Harbor (Friday Harbor Airport) and London (London International Airport) is 2032 miles / 3270 kilometers / 1766 nautical miles.

The driving distance from Friday Harbor (FRD) to London (YXU) is 2449 miles / 3942 kilometers, and travel time by car is about 45 hours 46 minutes.

Friday Harbor Airport – London International Airport

2032 miles / 3270 kilometers / 1766 nautical miles


Distance from Friday Harbor to London

There are several ways to calculate the distance from Friday Harbor to London. Here are two standard methods:

Vincenty's formula (applied above)
  • 2032.106 miles
  • 3270.358 kilometers
  • 1765.852 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
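
For the curious, here is a minimal Python sketch of Vincenty's inverse method. It assumes WGS-84 ellipsoid parameters (the page does not name its datum), and the helper name vincenty_m is ours, so treat the last digits as approximate:

    import math

    # WGS-84 ellipsoid (an assumption; the page doesn't name its datum)
    A_AXIS = 6378137.0          # semi-major axis, metres
    F = 1 / 298.257223563       # flattening
    B_AXIS = (1 - F) * A_AXIS   # semi-minor axis

    def vincenty_m(lat1, lon1, lat2, lon2):
        """Ellipsoidal distance in metres via Vincenty's inverse formula."""
        L = math.radians(lon2 - lon1)
        U1 = math.atan((1 - F) * math.tan(math.radians(lat1)))
        U2 = math.atan((1 - F) * math.tan(math.radians(lat2)))
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)

        lam = L
        for _ in range(200):                      # iterate to convergence
            sinLam, cosLam = math.sin(lam), math.cos(lam)
            sinSigma = math.hypot(cosU2 * sinLam,
                                  cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
            if sinSigma == 0:
                return 0.0                        # coincident points
            cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
            sigma = math.atan2(sinSigma, cosSigma)
            sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
            cos2Alpha = 1 - sinAlpha ** 2
            cos2SigmaM = (cosSigma - 2 * sinU1 * sinU2 / cos2Alpha
                          if cos2Alpha else 0.0)  # equatorial case
            C = F / 16 * cos2Alpha * (4 + F * (4 - 3 * cos2Alpha))
            lamPrev = lam
            lam = L + (1 - C) * F * sinAlpha * (
                sigma + C * sinSigma * (
                    cos2SigmaM + C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
            if abs(lam - lamPrev) < 1e-12:
                break

        u2 = cos2Alpha * (A_AXIS ** 2 - B_AXIS ** 2) / B_AXIS ** 2
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        dSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
            cosSigma * (-1 + 2 * cos2SigmaM ** 2)
            - B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2)
            * (-3 + 4 * cos2SigmaM ** 2)))
        return B_AXIS * A * (sigma - dSigma)

    # FRD (48°31′19″N, 123°1′26″W) -> YXU (43°2′8″N, 81°9′14″W)
    metres = vincenty_m(48.521944, -123.023889, 43.035556, -81.153889)
    print(metres / 1609.344)  # ≈ 2032 miles, in line with the figure above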

Haversine formula
  • 2026.582 miles
  • 3261.467 kilometers
  • 1761.051 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
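
The haversine formula is compact enough to show in full. The Python sketch below assumes a mean Earth radius of 3,958.8 miles (6,371 km); with that radius it reproduces the figure above to within rounding:

    import math

    def haversine_miles(lat1, lon1, lat2, lon2, radius_mi=3958.8):
        """Great-circle distance on a sphere (radius is an assumed
        mean Earth radius of ~3,958.8 mi / 6,371 km)."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        a = (math.sin(dphi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
        return 2 * radius_mi * math.asin(math.sqrt(a))

    # FRD (48°31′19″N, 123°1′26″W) -> YXU (43°2′8″N, 81°9′14″W)
    print(haversine_miles(48.521944, -123.023889, 43.035556, -81.153889))
    # ≈ 2026.6 miles, matching the haversine figure above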

How long does it take to fly from Friday Harbor to London?

The estimated flight time from Friday Harbor Airport to London International Airport is 4 hours and 20 minutes.
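
Estimates like this typically add a fixed taxi/climb/descent buffer to the time spent at cruise speed. The sketch below uses a 500 mph cruise and a 30-minute buffer, both assumptions rather than the calculator's published parameters, so its output differs slightly from the 4 hours 20 minutes quoted above:

    def flight_time(distance_mi, cruise_mph=500.0, buffer_h=0.5):
        """Rough block-time estimate: a fixed taxi/climb/descent buffer
        plus time at cruise speed. Both parameters are assumptions for
        illustration; the calculator's exact model isn't published."""
        hours = buffer_h + distance_mi / cruise_mph
        h, m = int(hours), round((hours - int(hours)) * 60)
        return f"{h} h {m:02d} min"

    print(flight_time(2032))  # ≈ 4 h 34 min with these assumed parameters;
                              # the quoted 4 h 20 min implies an average
                              # block speed of roughly 469 mph.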

Flight carbon footprint between Friday Harbor Airport (FRD) and London International Airport (YXU)

On average, flying from Friday Harbor to London generates about 221 kg (488 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
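
Per-passenger estimates like this usually scale an emission factor by distance. In the sketch below, the 0.109 kg/mile factor is simply back-calculated from the figures above (221 kg over 2,032 miles) for illustration; real calculators vary the factor with flight length and aircraft type:

    LB_PER_KG = 2.20462

    def co2_per_passenger_kg(distance_mi, kg_per_mile=0.109):
        # Simple linear emission model. The 0.109 kg CO2 per passenger-mile
        # factor is back-calculated from the page's own figures purely for
        # illustration, not taken from the calculator's methodology.
        return distance_mi * kg_per_mile

    kg = co2_per_passenger_kg(2032)
    print(f"{kg:.0f} kg CO2 ≈ {kg * LB_PER_KG:.0f} lb")  # -> 221 kg ≈ 488 lb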

Map of flight path and driving directions from Friday Harbor to London

See the map of the shortest flight path between Friday Harbor Airport (FRD) and London International Airport (YXU).

Airport information

Origin: Friday Harbor Airport
City: Friday Harbor, WA
Country: United States
IATA Code: FRD
ICAO Code: KFHR
Coordinates: 48°31′19″N, 123°1′26″W

Destination: London International Airport
City: London
Country: Canada
IATA Code: YXU
ICAO Code: CYXU
Coordinates: 43°2′8″N, 81°9′14″W