How far is London from Hamilton?

The distance between Hamilton (John C. Munro Hamilton International Airport) and London, Ontario (London International Airport) is 62 miles / 100 kilometers / 54 nautical miles.

The driving distance from Hamilton (YHM) to London (YXU) is 75 miles / 121 kilometers, and travel time by car is about 1 hour 42 minutes.

John C. Munro Hamilton International Airport – London International Airport: 62 miles / 100 kilometers / 54 nautical miles

Distance from Hamilton to London

There are several ways to calculate the distance from Hamilton to London. Here are two standard methods:

Vincenty's formula (applied above)
  • 62.385 miles
  • 100.398 kilometers
  • 54.211 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
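
For concreteness, here is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid. The page does not show its exact implementation, so treat this as an illustration; the inputs are the decimal-degree forms of the airport coordinates listed in the airport information section below.

```python
import math

# WGS-84 ellipsoid constants
A_AXIS = 6378137.0             # semi-major axis (metres)
F = 1 / 298.257223563          # flattening
B_AXIS = A_AXIS * (1 - F)      # semi-minor axis (metres)

def vincenty_m(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Vincenty's inverse formula: ellipsoidal distance in metres."""
    L = math.radians(lon2 - lon1)
    u1 = math.atan((1 - F) * math.tan(math.radians(lat1)))
    u2 = math.atan((1 - F) * math.tan(math.radians(lat2)))
    sin_u1, cos_u1 = math.sin(u1), math.cos(u1)
    sin_u2, cos_u2 = math.sin(u2), math.cos(u2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cos_u2 * sin_lam,
                               cos_u1 * sin_u2 - sin_u1 * cos_u2 * cos_lam)
        if sin_sigma == 0:
            return 0.0                      # coincident points
        cos_sigma = sin_u1 * sin_u2 + cos_u1 * cos_u2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cos_u1 * cos_u2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = (cos_sigma - 2 * sin_u1 * sin_u2 / cos2_alpha
                   if cos2_alpha else 0.0)  # geodesic along the equator
        c = F / 16 * cos2_alpha * (4 + F * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - c) * F * sin_alpha * (
            sigma + c * sin_sigma * (
                cos_2sm + c * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u_sq = cos2_alpha * (A_AXIS ** 2 - B_AXIS ** 2) / B_AXIS ** 2
    big_a = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
    big_b = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
    d_sigma = big_b * sin_sigma * (cos_2sm + big_b / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - big_b / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
        * (-3 + 4 * cos_2sm ** 2)))
    return B_AXIS * big_a * (sigma - d_sigma)

# YHM -> YXU, decimal degrees (west longitudes negative)
metres = vincenty_m(43.173333, -79.934722, 43.035556, -81.153889)
print(metres / 1609.344)  # ≈ 62.4 miles (the page reports 62.385)
```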

Haversine formula
  • 62.223 miles
  • 100.138 kilometers
  • 54.070 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
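
A matching sketch of the haversine formula, assuming a mean Earth radius of 6371 km; the exact result depends on the radius chosen and on coordinate rounding.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given mean radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    d_phi = math.radians(lat2 - lat1)
    d_lam = math.radians(lon2 - lon1)
    h = (math.sin(d_phi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(d_lam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h))

# YHM -> YXU, decimal degrees (west longitudes negative)
print(haversine_km(43.173333, -79.934722, 43.035556, -81.153889))
# ≈ 100.1–100.2 km, close to the 100.138 km above
```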

How long does it take to fly from Hamilton to London?

The estimated flight time from John C. Munro Hamilton International Airport to London International Airport is 37 minutes.
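
The page does not say how this estimate is computed. One common rule of thumb, a fixed allowance for taxi, climb, and descent plus cruise at a constant speed, happens to reproduce the figure; the 500 mph and 30 minute values below are assumptions, not the site's documented parameters.

```python
def estimated_flight_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    # Assumed rule of thumb: fixed takeoff/landing allowance
    # plus great-circle distance flown at a constant cruise speed.
    return overhead_min + distance_miles / cruise_mph * 60

print(round(estimated_flight_minutes(62.385)))  # -> 37
```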

What is the time difference between Hamilton and London?

There is no time difference between Hamilton and London.
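
Both cities sit in southern Ontario and observe Eastern Time (IANA zone America/Toronto), so their UTC offsets always match. A quick check with Python's standard-library zoneinfo module:

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # standard library in Python 3.9+

# Hamilton and London, Ontario both fall under America/Toronto,
# so the offset below applies to both cities at any instant.
eastern = ZoneInfo("America/Toronto")
print(datetime.now(eastern).utcoffset())
```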

Flight carbon footprint between John C. Munro Hamilton International Airport (YHM) and London International Airport (YXU)

On average, flying from Hamilton to London generates about 34 kg (75 pounds) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
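
The unit conversion is easy to verify; the 34 kg emission figure itself comes from the page, and only the kilogram-to-pound arithmetic is shown here.

```python
LB_PER_KG = 2.2046226218  # pounds per kilogram (international avoirdupois)
co2_kg = 34
print(co2_kg * LB_PER_KG)  # ≈ 74.96 lb, i.e. about 75 pounds
```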

Map of flight path and driving directions from Hamilton to London

See the map of the shortest flight path between John C. Munro Hamilton International Airport (YHM) and London International Airport (YXU).

Airport information

Origin: John C. Munro Hamilton International Airport
City: Hamilton
Country: Canada
IATA Code: YHM
ICAO Code: CYHM
Coordinates: 43°10′24″N, 79°56′5″W
Destination: London International Airport
City: London
Country: Canada
IATA Code: YXU
ICAO Code: CYXU
Coordinates: 43°2′8″N, 81°9′14″W
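
For reference, the decimal-degree coordinates used in the distance sketches above follow from these degrees/minutes/seconds values. The `dms_to_decimal` helper below is hypothetical, with south and west hemispheres treated as negative.

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

yhm = (dms_to_decimal(43, 10, 24, "N"), dms_to_decimal(79, 56, 5, "W"))
yxu = (dms_to_decimal(43, 2, 8, "N"), dms_to_decimal(81, 9, 14, "W"))
print(yhm)  # ≈ (43.1733, -79.9347)
print(yxu)  # ≈ (43.0356, -81.1539)
```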