
How far is London from Chevery?

The distance between Chevery (Chevery Airport) and London (London International Airport) is 1138 miles / 1831 kilometers / 989 nautical miles.

The driving distance from Chevery (YHR) to London (YXU) is 1379 miles / 2219 kilometers, and travel time by car is about 64 hours 29 minutes.

Chevery Airport – London International Airport

  • 1138 miles
  • 1831 kilometers
  • 989 nautical miles
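The three figures above are the same distance expressed in different units. A quick sketch of the conversions, using the exact definitions of the international mile and the nautical mile:

```python
# Convert the Vincenty distance (in miles) into the other two units shown above.
MILES_TO_KM = 1.609344   # international mile -> km, exact by definition
KM_PER_NMI = 1.852       # nautical mile -> km, exact by definition

d_miles = 1137.665
d_km = d_miles * MILES_TO_KM
d_nmi = d_km / KM_PER_NMI

print(f"{d_miles:.0f} mi = {d_km:.0f} km = {d_nmi:.0f} nmi")
```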


Distance from Chevery to London

There are several ways to calculate the distance from Chevery to London. Here are two standard methods:

Vincenty's formula (applied above)
  • 1137.665 miles
  • 1830.894 kilometers
  • 988.603 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
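A minimal sketch of Vincenty's inverse method, assuming the WGS-84 ellipsoid (the page does not state which ellipsoid it uses, but the figures above are consistent with WGS-84). The coordinates are taken from the airport table at the bottom of the page:

```python
import math

def vincenty_inverse(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Geodesic distance in kilometers on the WGS-84 ellipsoid (Vincenty inverse)."""
    a = 6378137.0              # semi-major axis (m), WGS-84
    f = 1 / 298.257223563      # flattening, WGS-84
    b = (1 - f) * a            # semi-minor axis (m)

    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(phi1))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000  # metres -> km

# Chevery (YHR) to London (YXU)
print(round(vincenty_inverse(50.468889, -59.636667, 43.035556, -81.153889), 3))
```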

Haversine formula
  • 1135.110 miles
  • 1826.783 kilometers
  • 986.384 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth, giving the great-circle distance (the shortest path between two points along the surface).
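The haversine calculation is short enough to show in full. This sketch assumes a mean Earth radius of 6371 km (a common convention, which reproduces the figure above):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance assuming a spherical Earth of the given mean radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# Chevery (YHR) to London (YXU), coordinates from the airport table below.
d = haversine_km(50.468889, -59.636667, 43.035556, -81.153889)
print(f"{d:.3f} km")   # ~1826.8 km, matching the haversine figure above
```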

How long does it take to fly from Chevery to London?

The estimated flight time from Chevery Airport to London International Airport is 2 hours and 39 minutes.
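The page does not publish how it estimates flight time. A common rule of thumb is a fixed allowance of about 30 minutes for taxi, climb, and descent plus cruise at an assumed average speed of around 500 mph; the sketch below uses those hypothetical numbers and lands in the same ballpark as the figure above:

```python
# Rough flight-time estimate: fixed overhead plus cruise time.
# Both constants are assumptions, not the calculator's actual parameters.
CRUISE_MPH = 500       # assumed average ground speed
OVERHEAD_MIN = 30      # assumed taxi/climb/descent allowance

distance_miles = 1138
total_min = OVERHEAD_MIN + distance_miles / CRUISE_MPH * 60
hours, minutes = divmod(round(total_min), 60)
print(f"about {hours} h {minutes} min")
```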

What is the time difference between Chevery and London?

There is no time difference between Chevery and London.

Flight carbon footprint between Chevery Airport (YHR) and London International Airport (YXU)

On average, flying from Chevery to London generates about 159 kg of CO2 per passenger, which is equivalent to about 350 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
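A quick sanity check of the unit conversion and the implied emissions intensity. Note that 159 kg converts to roughly 350.5 lb, so the page's 350 lb was presumably rounded from an unrounded kilogram figure:

```python
# Convert the CO2 estimate to pounds and compute the implied per-mile intensity.
KG_TO_LB = 2.20462262   # kilograms -> pounds

co2_kg = 159
co2_lb = co2_kg * KG_TO_LB
per_mile = co2_kg / 1138   # kg CO2 per passenger per mile flown

print(f"{co2_lb:.1f} lb, {per_mile:.2f} kg CO2 per passenger-mile")
```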

Map of flight path and driving directions from Chevery to London

See the map of the shortest flight path between Chevery Airport (YHR) and London International Airport (YXU).

Airport information

Origin Chevery Airport
City: Chevery
Country: Canada
IATA Code: YHR
ICAO Code: CYHR
Coordinates: 50°28′8″N, 59°38′12″W
Destination London International Airport
City: London
Country: Canada
IATA Code: YXU
ICAO Code: CYXU
Coordinates: 43°2′8″N, 81°9′14″W