
How far is Ushuaia from Barrow, AK?

The distance between Barrow (Wiley Post–Will Rogers Memorial Airport) and Ushuaia (Ushuaia – Malvinas Argentinas International Airport) is 9679 miles / 15577 kilometers / 8411 nautical miles.

Wiley Post–Will Rogers Memorial Airport → Ushuaia – Malvinas Argentinas International Airport

Distance: 9679 miles / 15577 kilometers / 8411 nautical miles
Flight time: 18 h 49 min
CO2 emission: 1 251 kg


Distance from Barrow to Ushuaia

There are several ways to calculate the distance from Barrow to Ushuaia. Here are two standard methods:

Vincenty's formula (applied above)
  • 9679.145 miles
  • 15577.074 kilometers
  • 8410.947 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
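For the curious, here is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid. The coordinates are the decimal form of the DMS values listed under "Airport information" below; the tolerance and iteration cap are conventional defaults, not anything this site documents.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    # WGS-84 ellipsoid parameters
    a = 6378137.0             # semi-major axis (m)
    f = 1 / 298.257223563     # flattening
    b = (1 - f) * a           # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0        # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
                * (-3 + 4 * cos_2sigma_m ** 2)))
    meters = b * A * (sigma - delta_sigma)
    return meters / 1609.344  # statute miles

# BRW (71°17′7″N, 156°45′57″W) -> USH (54°50′35″S, 68°17′44″W)
print(vincenty_miles(71.285278, -156.765833, -54.843056, -68.295556))
# should print roughly 9679.1, matching the Vincenty figure above
```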

Haversine formula
  • 9693.942 miles
  • 15600.887 kilometers
  • 8423.805 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
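A corresponding sketch of the haversine formula. The 6371 km mean Earth radius is a common convention; it is an assumption here, since the page doesn't state which radius it uses, so the last digits may differ slightly.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    # Great-circle distance on a sphere of the given mean radius.
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# BRW -> USH, decimal degrees (converted from the DMS values listed below)
km = haversine_km(71.285278, -156.765833, -54.843056, -68.295556)
print(f"{km:.3f} km = {km / 1.609344:.3f} mi")  # ≈ 15600.9 km / 9693.9 mi
```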

How long does it take to fly from Barrow to Ushuaia?

The estimated flight time from Wiley Post–Will Rogers Memorial Airport to Ushuaia – Malvinas Argentinas International Airport is 18 hours and 49 minutes.
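The page doesn't say how this estimate is derived. A common rule of thumb is distance divided by an assumed average cruise speed, plus a fixed buffer for taxi, climb, and descent; the parameters below are guesses tuned to land near the quoted time, not the site's actual method.

```python
def flight_time_hm(miles, cruise_mph=528.0, buffer_min=30):
    """Rough flight-time estimate: distance at an assumed average cruise
    speed plus a fixed taxi/climb buffer. Both parameters are assumptions,
    not the calculator's published method."""
    total_min = buffer_min + miles / cruise_mph * 60
    h, m = divmod(round(total_min), 60)
    return f"{h} h {m} min"

print(flight_time_hm(9679))  # ≈ 18 h 50 min, close to the 18 h 49 min shown
```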

Flight carbon footprint between Wiley Post–Will Rogers Memorial Airport (BRW) and Ushuaia – Malvinas Argentinas International Airport (USH)

On average, flying from Barrow to Ushuaia generates about 1 251 kg of CO2 per passenger, which equals roughly 2 759 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
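The pound figure follows from the standard kilogram-to-pound factor, as the quick check below shows; the one-pound difference suggests the site converts an unrounded kilogram estimate.

```python
CO2_KG = 1251           # per-passenger estimate from this page
LB_PER_KG = 2.20462262  # standard conversion factor

print(f"{CO2_KG * LB_PER_KG:,.0f} lb")  # ≈ 2,758 lb; the page's 2,759 lb
                                        # likely reflects an unrounded kg value
```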

Map of flight path from Barrow to Ushuaia

See the map of the shortest flight path between Wiley Post–Will Rogers Memorial Airport (BRW) and Ushuaia – Malvinas Argentinas International Airport (USH).

Airport information

Origin: Wiley Post–Will Rogers Memorial Airport
City: Barrow, AK
Country: United States
IATA Code: BRW
ICAO Code: PABR
Coordinates: 71°17′7″N, 156°45′57″W
Destination: Ushuaia – Malvinas Argentinas International Airport
City: Ushuaia
Country: Argentina
IATA Code: USH
ICAO Code: SAWH
Coordinates: 54°50′35″S, 68°17′44″W
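The coordinates above are given in degrees, minutes, and seconds. A small helper (a sketch, not part of the site) converts them to the signed decimal degrees used by the formula examples earlier on this page:

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    # Convert degrees/minutes/seconds to signed decimal degrees;
    # south and west hemispheres are negative by convention.
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

# BRW: 71°17′7″N, 156°45′57″W
print(dms_to_decimal(71, 17, 7, "N"), dms_to_decimal(156, 45, 57, "W"))
# USH: 54°50′35″S, 68°17′44″W
print(dms_to_decimal(54, 50, 35, "S"), dms_to_decimal(68, 17, 44, "W"))
```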