
How far is Ushuaia from Longyearbyen?

The distance between Longyearbyen (Svalbard Airport, Longyear) and Ushuaia (Ushuaia – Malvinas Argentinas International Airport) is 9794 miles / 15762 kilometers / 8511 nautical miles.


Distance from Longyearbyen to Ushuaia

There are several ways to calculate the distance from Longyearbyen to Ushuaia. Here are two standard methods:

Vincenty's formula (applied above)
  • 9793.978 miles
  • 15761.879 kilometers
  • 8510.734 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points using an ellipsoidal model of the Earth, which accounts for the planet's slight flattening at the poles.
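As a rough sketch (not the site's exact implementation), Vincenty's inverse method on the WGS-84 ellipsoid can be iterated in Python as below. The airport coordinates are taken from the airport information section of this page.

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns kilometers."""
    a = 6378137.0                # WGS-84 semi-major axis (meters)
    f = 1 / 298.257223563       # WGS-84 flattening
    b = (1 - f) * a
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0           # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - d_sigma) / 1000.0

# LYR (78°14′45″N, 15°27′56″E) to USH (54°50′35″S, 68°17′44″W)
print(round(vincenty_km(78.245833, 15.465556, -54.843056, -68.295556)), "km")
```

Note that the iteration can fail to converge for nearly antipodal points; production code typically falls back to a library such as GeographicLib in that case.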

Haversine formula
  • 9809.205 miles
  • 15786.386 kilometers
  • 8523.966 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical Earth (great-circle distance, the shortest path between two points on a sphere).
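A minimal haversine implementation, assuming the commonly used mean Earth radius of 6371 km (the site's exact radius is not stated, so small differences from the figure above are expected):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h))

# LYR to USH, using the coordinates from the airport information section
print(round(haversine_km(78.245833, 15.465556, -54.843056, -68.295556)), "km")
```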

How long does it take to fly from Longyearbyen to Ushuaia?

The estimated flight time from Svalbard Airport, Longyear to Ushuaia – Malvinas Argentinas International Airport is 19 hours and 2 minutes.
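A common way to estimate flight time is a fixed taxi/climb overhead plus cruise time. The 500 mph cruise speed and 30-minute overhead below are assumptions for illustration, not the site's actual model (the page's 19 h 02 min figure implies a slightly faster assumed cruise speed):

```python
def flight_time(distance_miles, cruise_mph=500.0, overhead_min=30):
    """Rough flight-time estimate: fixed overhead plus cruise time.
    cruise_mph and overhead_min are assumed values, not the site's model."""
    total_min = overhead_min + distance_miles / cruise_mph * 60
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} hours and {minutes} minutes"

print(flight_time(9794))  # → 20 hours and 5 minutes
```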

Flight carbon footprint between Svalbard Airport, Longyear (LYR) and Ushuaia – Malvinas Argentinas International Airport (USH)

On average, flying from Longyearbyen to Ushuaia generates about 1,269 kg of CO2 per passenger, which is about 2,798 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
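A simple per-passenger estimate multiplies the distance by an emission factor. The 0.0805 kg CO2 per km factor below is an assumption chosen so the numbers roughly match the figure above; the site's actual methodology is not published on this page. The kg-to-pounds conversion uses the standard factor of 2.20462:

```python
KG_PER_KM = 0.0805    # assumed long-haul per-passenger factor (kg CO2 per km)
LBS_PER_KG = 2.20462  # exact-enough conversion factor

def co2_estimate(distance_km, factor=KG_PER_KM):
    """Return (kg, lbs) of estimated CO2 per passenger for a given distance."""
    kg = distance_km * factor
    return kg, kg * LBS_PER_KG

kg, lbs = co2_estimate(15762)
print(round(kg), "kg,", round(lbs), "lbs")
```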

Map of flight path from Longyearbyen to Ushuaia

See the map of the shortest flight path between Svalbard Airport, Longyear (LYR) and Ushuaia – Malvinas Argentinas International Airport (USH).

Airport information

Origin: Svalbard Airport, Longyear
City: Longyearbyen
Country: Norway
IATA Code: LYR
ICAO Code: ENSB
Coordinates: 78°14′45″N, 15°27′56″E
Destination: Ushuaia – Malvinas Argentinas International Airport
City: Ushuaia
Country: Argentina
IATA Code: USH
ICAO Code: SAWH
Coordinates: 54°50′35″S, 68°17′44″W
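The coordinates above are in degrees/minutes/seconds; the distance formulas need decimal degrees. A small conversion helper (the function name is illustrative, not from this site):

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere letter (N/S/E/W)
    to signed decimal degrees; S and W are negative."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

print(dms_to_decimal(78, 14, 45, "N"))  # LYR latitude
print(dms_to_decimal(68, 17, 44, "W"))  # USH longitude (negative)
```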