
How far is London from Windsor?

The distance between Windsor, Ontario (Windsor International Airport) and London, Ontario (London International Airport) is 106 miles / 170 kilometers / 92 nautical miles.

The driving distance from Windsor (YQG) to London (YXU) is 122 miles / 197 kilometers, and travel time by car is about 2 hours 36 minutes.

Windsor International Airport – London International Airport

Distance: 106 miles / 170 kilometers / 92 nautical miles

Distance from Windsor to London

There are several ways to calculate the distance from Windsor to London. Here are two standard methods:

Vincenty's formula (applied above)
  • 105.724 miles
  • 170.146 kilometers
  • 91.872 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
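
For readers who want to reproduce the figure, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The decimal coordinates come from the DMS values in the airport information below; the convergence threshold and iteration cap are implementation choices, not part of the formula itself.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Ellipsoidal (WGS-84) distance in meters via Vincenty's inverse formula."""
    a = 6378137.0              # semi-major axis (m)
    f = 1 / 298.257223563      # flattening
    b = (1 - f) * a            # semi-minor axis (m)

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):       # iterate until the longitude converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0         # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                   if cos2_alpha else 0.0)   # equatorial geodesic
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a * a - b * b) / (b * b)
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
                           * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma)   # meters

# YQG -> YXU (decimal degrees converted from the DMS coordinates listed below)
meters = vincenty_distance(42.2756, -82.9556, 43.0356, -81.1539)
print(meters / 1609.344)   # ~105.7 miles
```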

Haversine formula
  • 105.538 miles
  • 169.847 kilometers
  • 91.710 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
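
A haversine sketch is shorter, since a sphere needs only a radius. The mean Earth radius of 6,371 km used here is a common convention; the small gap between this result and Vincenty's comes from approximating the ellipsoid with a sphere.

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters, assuming a spherical Earth."""
    R = 6371000.0   # mean Earth radius (m), a common convention
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

meters = haversine_distance(42.2756, -82.9556, 43.0356, -81.1539)
print(meters / 1609.344)   # ~105.5 miles
```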

How long does it take to fly from Windsor to London?

The estimated flight time from Windsor International Airport to London International Airport is 42 minutes.
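
The 42-minute figure is consistent with a common rule of thumb for short flights: allow roughly half an hour for taxi, climb, and descent, then add cruise time at around 500 mph. This is an illustrative assumption, not necessarily the exact model behind the estimate above.

```python
# Rule-of-thumb flight time (assumed model, for illustration only):
# ~30 min of taxi/climb/descent overhead plus cruise at ~500 mph.
distance_miles = 106
cruise_mph = 500
minutes = 30 + distance_miles / cruise_mph * 60
print(round(minutes))   # ~43 minutes, close to the 42-minute estimate
```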

What is the time difference between Windsor and London?

There is no time difference between Windsor and London; both cities observe Eastern Time.

Flight carbon footprint between Windsor International Airport (YQG) and London International Airport (YXU)

On average, flying from Windsor to London generates about 41 kg (roughly 89 pounds) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
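
As a back-of-envelope check, per-passenger emissions are often estimated as distance multiplied by an emission factor. The factor below is an assumption chosen to illustrate the arithmetic at short-haul ranges; it is not the site's actual methodology.

```python
# Back-of-envelope CO2 estimate (assumed factor, not the site's model):
# short-haul flights are often pegged near 0.24 kg CO2 per passenger-km.
distance_km = 170
factor_kg_per_km = 0.24
co2_kg = distance_km * factor_kg_per_km
print(round(co2_kg))             # ~41 kg
print(round(co2_kg * 2.20462))   # ~90 lb; rounding of the underlying
                                 # estimate explains the 89 lb figure above
```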

Map of flight path and driving directions from Windsor to London

See the map of the shortest flight path between Windsor International Airport (YQG) and London International Airport (YXU).

Airport information

Origin: Windsor International Airport
City: Windsor
Country: Canada
IATA Code: YQG
ICAO Code: CYQG
Coordinates: 42°16′32″N, 82°57′20″W

Destination: London International Airport
City: London
Country: Canada
IATA Code: YXU
ICAO Code: CYXU
Coordinates: 43°2′8″N, 81°9′14″W
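
To feed these DMS coordinates into the distance formulas above, they first have to be converted to signed decimal degrees. A small helper (hypothetical, shown for illustration):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

yqg = (dms_to_decimal(42, 16, 32, "N"), dms_to_decimal(82, 57, 20, "W"))
yxu = (dms_to_decimal(43, 2, 8, "N"), dms_to_decimal(81, 9, 14, "W"))
print(yqg)   # ~(42.2756, -82.9556)
print(yxu)   # ~(43.0356, -81.1539)
```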