
How far is London from Havana?

The distance between Havana (José Martí International Airport) and London, Ontario, Canada (London International Airport) is 1383 miles / 2226 kilometers / 1202 nautical miles.

José Martí International Airport – London International Airport

1383 miles / 2226 kilometers / 1202 nautical miles


Distance from Havana to London

There are several ways to calculate the distance from Havana to London. Here are two standard methods:

Vincenty's formula (applied above)
  • 1383.391 miles
  • 2226.352 kilometers
  • 1202.134 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
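For readers who want to reproduce the ellipsoidal figure, here is a minimal sketch using the geopy library, whose geodesic() function computes distance on the WGS-84 ellipsoid (Karney's algorithm, which agrees with Vincenty's results to well under a meter on a route like this). The decimal coordinates are converted from the airport information at the bottom of the page.

```python
# Ellipsoidal (WGS-84) distance between HAV and YXU using geopy.
from geopy.distance import geodesic

hav = (22.989167, -82.408889)  # José Martí International (HAV)
yxu = (43.035556, -81.153889)  # London International (YXU)

d = geodesic(hav, yxu)
print(f"{d.miles:.3f} miles")              # ≈ 1383.4
print(f"{d.kilometers:.3f} kilometers")    # ≈ 2226.4
print(f"{d.nautical:.3f} nautical miles")  # ≈ 1202.1
```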

Haversine formula
  • 1386.938 miles
  • 2232.060 kilometers
  • 1205.216 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
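The haversine figure is easy to reproduce from scratch. The sketch below assumes a mean Earth radius of 6371 km, which matches the numbers above; the radius constant the site actually uses is not published.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

km = haversine_km(22.989167, -82.408889, 43.035556, -81.153889)
print(f"{km:.3f} km")             # ≈ 2232.0 kilometers
print(f"{km / 1.609344:.3f} mi")  # ≈ 1386.9 miles
print(f"{km / 1.852:.3f} nmi")    # ≈ 1205.2 nautical miles
```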

How long does it take to fly from Havana to London?

The estimated flight time from José Martí International Airport to London International Airport is 3 hours and 7 minutes.
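The calculator does not publish its flight-time model. A common rule of thumb is cruise time plus a fixed allowance for taxi, climb, and descent; the sketch below uses an assumed 420-knot average speed and a 15-minute allowance, which happens to land near the 3 hours 7 minutes shown, but both parameters are illustrative guesses rather than the site's actual inputs.

```python
# Rough block-time estimate: cruise time plus a fixed allowance.
# The 420-knot speed and 15-minute allowance are assumptions,
# not the calculator's published model.
distance_nmi = 1202.134
cruise_knots = 420   # assumed average ground speed
allowance_min = 15   # assumed taxi/climb/descent allowance

total_min = distance_nmi / cruise_knots * 60 + allowance_min
print(f"{int(total_min // 60)} h {round(total_min % 60)} min")  # ≈ 3 h 7 min
```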

What is the time difference between Havana and London?

There is no time difference between Havana and London.
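Both cities keep UTC−5 in winter and UTC−4 in summer (IANA zones America/Havana and America/Toronto, since this is London, Ontario), which a quick check confirms; note that Cuba's daylight-saving transition dates have occasionally differed from Canada's, so brief one-hour offsets are possible around the changeover.

```python
# Compare the current UTC offsets of the two cities' time zones.
from datetime import datetime
from zoneinfo import ZoneInfo

now = datetime.now(tz=ZoneInfo("UTC"))
for tz in ("America/Havana", "America/Toronto"):
    local = now.astimezone(ZoneInfo(tz))
    offset_h = local.utcoffset().total_seconds() / 3600
    print(f"{tz}: {local:%H:%M} (UTC{offset_h:+.0f})")
```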

Flight carbon footprint between José Martí International Airport (HAV) and London International Airport (YXU)

On average, flying from Havana to London generates about 172 kg (roughly 380 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
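A quick unit check on those figures; the per-distance intensities below are derived from the page's own distance numbers:

```python
# Convert 172 kg of CO2 per passenger and derive per-distance intensity.
kg = 172
print(f"{kg * 2.20462:.0f} lb")          # ≈ 379 lb (the page rounds to 380)
print(f"{kg / 2226 * 1000:.0f} g/km")    # ≈ 77 g of CO2 per passenger-km
print(f"{kg / 1383 * 1000:.0f} g/mile")  # ≈ 124 g of CO2 per passenger-mile
```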

Map of flight path from Havana to London

See the map of the shortest flight path between José Martí International Airport (HAV) and London International Airport (YXU).

Airport information

Origin: José Martí International Airport
City: Havana
Country: Cuba
IATA Code: HAV
ICAO Code: MUHA
Coordinates: 22°59′21″N, 82°24′32″W
Destination: London International Airport
City: London
Country: Canada
IATA Code: YXU
ICAO Code: CYXU
Coordinates: 43°2′8″N, 81°9′14″W
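The coordinates above are listed in degrees-minutes-seconds, while the distance formulas earlier on the page work in decimal degrees. A small helper converts between the two:

```python
# Convert DMS coordinates to the decimal degrees used by the formulas.
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

print(dms_to_decimal(22, 59, 21, "N"), dms_to_decimal(82, 24, 32, "W"))  # HAV
print(dms_to_decimal(43, 2, 8, "N"), dms_to_decimal(81, 9, 14, "W"))     # YXU
```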