How far is Ushuaia from Tampa, FL?
The distance between Tampa (Tampa International Airport) and Ushuaia (Ushuaia – Malvinas Argentinas International Airport) is 5764 miles / 9276 kilometers / 5009 nautical miles.
Distance from Tampa to Ushuaia
There are several ways to calculate the distance from Tampa to Ushuaia. Here are two standard methods:
Vincenty's formula (applied above)
- 5764.094 miles
- 9276.411 kilometers
- 5008.861 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
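As an illustration (not this page's own code), an ellipsoidal distance very close to the Vincenty figure above can be computed with the third-party geopy package, which is an assumption here and uses Karney's geodesic algorithm on the WGS-84 ellipsoid rather than Vincenty's iteration; the two methods should agree closely at this scale. The decimal coordinates are converted from the airport table further down the page.

```python
# Minimal sketch, assuming geopy is installed (pip install geopy).
from geopy.distance import geodesic

# Decimal-degree coordinates derived from the airport table below.
TPA = (27.9753, -82.5331)   # Tampa International Airport
USH = (-54.8431, -68.2956)  # Ushuaia - Malvinas Argentinas International Airport

d = geodesic(TPA, USH)  # distance on the WGS-84 ellipsoid
print(f"{d.miles:.1f} mi / {d.km:.1f} km / {d.nautical:.1f} nmi")
# Should land close to the ~5764 mi / 9276 km / 5009 nmi figures quoted above.
```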
Haversine formula
- 5784.498 miles
- 9309.247 kilometers
- 5026.591 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance, the shortest path between two points along the sphere's surface).
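For reference, here is a minimal from-scratch sketch of the haversine calculation using only the Python standard library, again with decimal coordinates derived from the airport table below; it reproduces the spherical figures above to within rounding.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points on a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

km = haversine_km(27.9753, -82.5331, -54.8431, -68.2956)  # TPA -> USH
print(f"{km:.1f} km / {km / 1.609344:.1f} mi / {km / 1.852:.1f} nmi")
# Approximately 9309 km / 5784 mi / 5026 nmi, matching the haversine numbers above.
```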
How long does it take to fly from Tampa to Ushuaia?
The estimated flight time from Tampa International Airport to Ushuaia – Malvinas Argentinas International Airport is 11 hours and 24 minutes.
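The page does not state how this estimate is derived. As a rough sanity check, dividing the ellipsoidal distance by the quoted time implies an average speed of about 506 mph, which is a plausible figure for a long-haul jet once taxi, climb, and descent are averaged in.

```python
# Sanity check using only the numbers quoted above.
distance_mi = 5764.094        # Vincenty distance from above
flight_time_h = 11 + 24 / 60  # 11 hours 24 minutes
print(f"Implied average speed: {distance_mi / flight_time_h:.0f} mph")  # ~506 mph
```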
What is the time difference between Tampa and Ushuaia?
The time difference between Tampa and Ushuaia is 2 hours. Ushuaia is 2 hours ahead of Tampa (1 hour ahead while Tampa is on daylight saving time, since Argentina does not observe it).
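A minimal sketch of how the offset can be checked with Python's standard zoneinfo database, assuming Tampa follows US Eastern Time and Ushuaia follows Argentina Time:

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # Python 3.9+

tampa = ZoneInfo("America/New_York")             # Tampa, FL (US Eastern Time)
ushuaia = ZoneInfo("America/Argentina/Ushuaia")  # Ushuaia, Argentina (UTC-3 year-round)

now = datetime.now(tz=ZoneInfo("UTC"))
diff = now.astimezone(ushuaia).utcoffset() - now.astimezone(tampa).utcoffset()
print(f"Ushuaia is {diff.total_seconds() / 3600:+.0f} hours relative to Tampa")
# +2 while Tampa is on standard time, +1 during US daylight saving time.
```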
Flight carbon footprint between Tampa International Airport (TPA) and Ushuaia – Malvinas Argentinas International Airport (USH)
On average, flying from Tampa to Ushuaia generates about 685 kg of CO2 per passenger, and 685 kilograms equals 1,510 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
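The pound figure is simply a unit conversion of the kilogram estimate (1 kg is about 2.20462 lb); the per-passenger value itself depends on aircraft type and load factor and is only an estimate, as noted above.

```python
co2_kg = 685
print(f"{co2_kg} kg ~ {co2_kg * 2.20462:.0f} lb")  # 685 kg ~ 1510 lb
```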
Map of flight path from Tampa to Ushuaia
See the map of the shortest flight path between Tampa International Airport (TPA) and Ushuaia – Malvinas Argentinas International Airport (USH).
Airport information
| Origin | Tampa International Airport |
| --- | --- |
| City: | Tampa, FL |
| Country: | United States |
| IATA Code: | TPA |
| ICAO Code: | KTPA |
| Coordinates: | 27°58′31″N, 82°31′59″W |
| Destination | Ushuaia – Malvinas Argentinas International Airport |
| --- | --- |
| City: | Ushuaia |
| Country: | Argentina |
| IATA Code: | USH |
| ICAO Code: | SAWH |
| Coordinates: | 54°50′35″S, 68°17′44″W |
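To reuse the tabulated coordinates in the distance code above, a small helper can convert the degrees-minutes-seconds strings into decimal degrees. The string format parsed here is an assumption about how you would store the values, not something specified on this page.

```python
import re

def dms_to_decimal(dms: str) -> float:
    """Convert a coordinate like 27°58′31″N to signed decimal degrees."""
    deg, minutes, seconds, hemi = re.match(r"(\d+)°(\d+)′(\d+)″([NSEW])", dms).groups()
    value = int(deg) + int(minutes) / 60 + int(seconds) / 3600
    return -value if hemi in "SW" else value

print(dms_to_decimal("27°58′31″N"), dms_to_decimal("82°31′59″W"))  # ~27.9753, ~-82.5331
print(dms_to_decimal("54°50′35″S"), dms_to_decimal("68°17′44″W"))  # ~-54.8431, ~-68.2956
```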