How far is Ushuaia from London?

The distance between London (London Heathrow Airport) and Ushuaia (Ushuaia – Malvinas Argentinas International Airport) is 8293 miles / 13347 kilometers / 7207 nautical miles.

London Heathrow Airport – Ushuaia – Malvinas Argentinas International Airport

Distance: 8293 miles / 13347 kilometers / 7207 nautical miles
Flight time: 16 h 12 min
CO2 emission: 1 042 kg

Distance from London to Ushuaia

There are several ways to calculate the distance from London to Ushuaia. Here are two standard methods:

Vincenty's formula (applied above)
  • 8293.492 miles
  • 13347.082 kilometers
  • 7206.848 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
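For reference, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. It follows the standard published iteration and is illustrative rather than the calculator's exact code; the decimal-degree coordinates are converted from the airport information below.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Distance in meters on the WGS-84 ellipsoid via Vincenty's
    inverse formula. Sketch only; the iteration can fail to converge
    for nearly antipodal points."""
    a = 6378137.0              # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a            # semi-minor axis

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):       # iterate longitude difference to convergence
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0         # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                   if cos2_alpha else 0.0)  # 0 on equatorial geodesics
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
                          * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - d_sigma)

# LHR and USH in decimal degrees (from the airport information below)
meters = vincenty_distance(51.4706, -0.4617, -54.8431, -68.2956)
print(f"{meters / 1609.344:.1f} miles")  # ≈ 8293 miles, matching the figure above
```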

Haversine formula
  • 8310.686 miles
  • 13374.753 kilometers
  • 7221.789 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
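A spherical-earth sketch of the same calculation. The mean earth radius of 6371 km (about 3958.8 mi) is an assumed value; the choice of radius, together with rounding of the coordinates, accounts for the small gap between the two figures.

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2, radius_miles=3958.8):
    """Great-circle distance assuming a spherical earth
    (mean radius ≈ 6371 km ≈ 3958.8 mi)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_miles * math.asin(math.sqrt(a))

# LHR -> USH with the coordinates from the airport information below
print(haversine_miles(51.4706, -0.4617, -54.8431, -68.2956))
# ≈ 8310 miles; the exact value depends on the radius and coordinate precision used
```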

How long does it take to fly from London to Ushuaia?

The estimated flight time from London Heathrow Airport to Ushuaia – Malvinas Argentinas International Airport is 16 hours and 12 minutes.
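The page does not state how the flight time is derived. A common rule of thumb is distance divided by a typical airliner cruise speed, plus a fixed allowance for taxi, climb, and descent; the parameters below are illustrative assumptions, not the calculator's published method.

```python
def estimate_flight_time(distance_miles, cruise_mph=500.0, overhead_h=0.5):
    """Rough flight-time estimate: cruise time plus a fixed overhead.
    cruise_mph and overhead_h are assumed values for illustration."""
    hours = distance_miles / cruise_mph + overhead_h
    h, m = int(hours), round((hours - int(hours)) * 60)
    return f"{h} h {m} min"

print(estimate_flight_time(8293))  # "17 h 5 min" under these assumptions;
                                   # a faster assumed cruise yields the 16 h 12 min above
```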

Flight carbon footprint between London Heathrow Airport (LHR) and Ushuaia – Malvinas Argentinas International Airport (USH)

On average, flying from London to Ushuaia generates about 1 042 kg (2 296 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
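The pound figure is a straight unit conversion from kilograms, easy to verify; the one-pound mismatch presumably comes from converting an unrounded kilogram value.

```python
KG_PER_LB = 0.45359237            # exact definition of the avoirdupois pound
co2_kg = 1042
print(round(co2_kg / KG_PER_LB))  # 2297 lbs; the 2 296 lbs above likely
                                  # reflects an unrounded kg figure
```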

Map of flight path from London to Ushuaia

See the map of the shortest flight path between London Heathrow Airport (LHR) and Ushuaia – Malvinas Argentinas International Airport (USH).

Airport information

Origin: London Heathrow Airport
City: London
Country: United Kingdom
IATA Code: LHR
ICAO Code: EGLL
Coordinates: 51°28′14″N, 0°27′42″W
Destination: Ushuaia – Malvinas Argentinas International Airport
City: Ushuaia
Country: Argentina
IATA Code: USH
ICAO Code: SAWH
Coordinates: 54°50′35″S, 68°17′44″W
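The coordinates above are given in degrees, minutes, and seconds, while the code sketches earlier use decimal degrees. A small hypothetical helper for the conversion:

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds to signed decimal degrees.
    Southern and western hemispheres are negative."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

print(dms_to_decimal(51, 28, 14, "N"), dms_to_decimal(0, 27, 42, "W"))   # LHR
print(dms_to_decimal(54, 50, 35, "S"), dms_to_decimal(68, 17, 44, "W"))  # USH
```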