How far is Ushuaia from Cartagena?

The distance between Cartagena (Rafael Núñez International Airport) and Ushuaia (Ushuaia – Malvinas Argentinas International Airport) is 4515 miles / 7266 kilometers / 3923 nautical miles.

The driving distance from Cartagena (CTG) to Ushuaia (USH) is 6398 miles / 10296 kilometers, and travel time by car is about 134 hours 24 minutes.

Distance from Cartagena to Ushuaia

There are several ways to calculate the distance from Cartagena to Ushuaia. Here are two standard methods:

Vincenty's formula (applied above)
  • 4515.020 miles
  • 7266.221 kilometers
  • 3923.446 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
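As an illustration, the figures above can be reproduced with a straightforward implementation of Vincenty's inverse formula on the WGS-84 ellipsoid. This is a sketch of the standard published iteration, not necessarily the exact code used for this page:

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2):
    """Geodesic distance in km via Vincenty's inverse formula (WGS-84)."""
    a = 6378137.0                # semi-major axis (m)
    f = 1 / 298.257223563        # flattening
    b = (1 - f) * a              # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate until the longitude difference converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (2 * cos_2sm ** 2 - 1)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sm + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sm ** 2)
            - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0

# CTG (10°26′32″N, 75°30′46″W) to USH (54°50′35″S, 68°17′44″W)
print(vincenty_km(10.44222, -75.51278, -54.84306, -68.29556))  # ≈ 7266 km
```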

Haversine formula
  • 4530.342 miles
  • 7290.879 kilometers
  • 3936.760 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
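The haversine figure above can be checked with a few lines of code. This sketch assumes a mean earth radius of 6,371 km, a common choice for great-circle calculations:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, r=6371.0):
    """Great-circle distance in km, treating the earth as a sphere of radius r."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(h))

# CTG (10°26′32″N, 75°30′46″W) to USH (54°50′35″S, 68°17′44″W)
print(haversine_km(10.44222, -75.51278, -54.84306, -68.29556))  # ≈ 7291 km
```

Because the sphere is only an approximation of the earth's shape, the haversine result differs from the ellipsoidal Vincenty result by roughly 25 km on this route.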

How long does it take to fly from Cartagena to Ushuaia?

The estimated flight time from Rafael Núñez International Airport to Ushuaia – Malvinas Argentinas International Airport is 9 hours and 2 minutes.
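A simple estimate of this kind can be derived from the distance alone. The sketch below assumes a hypothetical average block speed of 500 mph; the exact speed and any buffer used for the figure above are not stated:

```python
# Assumed average speed of 500 mph over the 4,515-mile distance (an assumption,
# not the site's documented method).
distance_miles = 4515.0
hours = distance_miles / 500          # 9.03 hours
h = int(hours)
m = round((hours - h) * 60)
print(f"{h} hours {m} minutes")       # 9 hours 2 minutes
```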

Flight carbon footprint between Rafael Núñez International Airport (CTG) and Ushuaia – Malvinas Argentinas International Airport (USH)

On average, flying from Cartagena to Ushuaia generates about 521 kg of CO2 per passenger; 521 kilograms equals 1,149 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
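The unit conversion behind the pounds figure is straightforward (1 kg ≈ 2.20462 lb):

```python
co2_kg = 521
co2_lbs = co2_kg * 2.20462   # kilograms to pounds
print(round(co2_lbs))        # 1149
```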

Map of flight path and driving directions from Cartagena to Ushuaia

See the map of the shortest flight path between Rafael Núñez International Airport (CTG) and Ushuaia – Malvinas Argentinas International Airport (USH).

Airport information

Origin Rafael Núñez International Airport
City: Cartagena
Country: Colombia
IATA Code: CTG
ICAO Code: SKCG
Coordinates: 10°26′32″N, 75°30′46″W
Destination Ushuaia – Malvinas Argentinas International Airport
City: Ushuaia
Country: Argentina
IATA Code: USH
ICAO Code: SAWH
Coordinates: 54°50′35″S, 68°17′44″W