
How far is Ushuaia from Chita?

The distance between Chita (Chita-Kadala International Airport) and Ushuaia (Ushuaia – Malvinas Argentinas International Airport) is 12225 miles / 19674 kilometers / 10623 nautical miles.

Chita-Kadala International Airport → Ushuaia – Malvinas Argentinas International Airport

Distance: 12225 miles / 19674 kilometers / 10623 nautical miles
Flight time: 23 h 38 min
CO2 emission: 1 664 kg


Distance from Chita to Ushuaia

There are several ways to calculate the distance from Chita to Ushuaia. Here are two standard methods:

Vincenty's formula (applied above)
  • 12224.866 miles
  • 19674.015 kilometers
  • 10623.118 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
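The site doesn't publish its implementation, but the method itself is well documented. Below is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid; the ellipsoid constants are standard, while the function name, tolerance, and iteration cap are choices made for this illustration, not the site's code.

    import math

    # WGS-84 ellipsoid parameters (standard values)
    WGS84_A = 6378137.0          # semi-major axis, meters
    WGS84_F = 1 / 298.257223563  # flattening
    WGS84_B = (1 - WGS84_F) * WGS84_A

    def vincenty_km(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=1000):
        """Distance in kilometers via Vincenty's inverse formula."""
        f = WGS84_F
        L = math.radians(lon2 - lon1)
        # Reduced latitudes on the auxiliary sphere
        U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
        U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)

        lam = L
        for _ in range(max_iter):
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.hypot(cosU2 * sin_lam,
                                   cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
            if sin_sigma == 0:
                return 0.0  # coincident points
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos2_alpha = 1 - sin_alpha ** 2
            # cos2_alpha is 0 for equatorial geodesics; the term vanishes there
            cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
            C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
            lam_prev = lam
            lam = L + (1 - C) * f * sin_alpha * (
                sigma + C * sin_sigma * (
                    cos_2sm + C * cos_sigma * (2 * cos_2sm ** 2 - 1)))
            if abs(lam - lam_prev) < tol:
                break
        else:
            raise ValueError("did not converge (points may be nearly antipodal)")

        u2 = cos2_alpha * (WGS84_A ** 2 - WGS84_B ** 2) / WGS84_B ** 2
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
            cos_sigma * (2 * cos_2sm ** 2 - 1)
            - B / 6 * cos_2sm * (4 * sin_sigma ** 2 - 3) * (4 * cos_2sm ** 2 - 3)))
        return WGS84_B * A * (sigma - delta_sigma) / 1000.0

    print(round(vincenty_km(52.026, 113.306, -54.843, -68.296)))  # ≈ 19674 km, matching the figure above

Note that the lambda iteration is known to converge slowly, or fail outright, for nearly antipodal points; Chita and Ushuaia are close to that regime, which is why the iteration cap here is generous.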

Haversine formula
  • 12231.329 miles
  • 19684.417 kilometers
  • 10628.735 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth, yielding the great-circle distance: the shortest path between two points along the sphere's surface.
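A spherical model makes the formula much shorter. A minimal sketch, assuming the conventional mean Earth radius of 6371 km (a different radius choice shifts the result slightly):

    import math

    EARTH_RADIUS_KM = 6371.0  # mean Earth radius; an assumed constant

    def haversine_km(lat1, lon1, lat2, lon2):
        """Great-circle distance in kilometers between two lat/lon points."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        a = (math.sin(dphi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
        return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

    # Chita (HTA) to Ushuaia (USH), coordinates from the airport listings below
    print(round(haversine_km(52.026, 113.306, -54.843, -68.296)))  # ≈ 19684 km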

How long does it take to fly from Chita to Ushuaia?

The estimated flight time from Chita-Kadala International Airport to Ushuaia – Malvinas Argentinas International Airport is 23 hours and 38 minutes.
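The page doesn't state the assumptions behind this estimate. A common rule of thumb that reproduces the figure to within a minute is an average speed of about 850 km/h plus a fixed 30-minute allowance for takeoff and climb; both constants in this sketch are assumptions, not the site's published method.

    CRUISE_SPEED_KMH = 850.0   # assumed average speed, not a published value
    OVERHEAD_MIN = 30          # assumed fixed allowance for takeoff and climb

    def flight_time_minutes(distance_km):
        """Rough estimate: time at cruise speed plus a fixed overhead."""
        return distance_km / CRUISE_SPEED_KMH * 60 + OVERHEAD_MIN

    minutes = flight_time_minutes(19674)
    print(f"{int(minutes // 60)} h {round(minutes % 60)} min")  # 23 h 39 min, within a minute of the figure above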

Flight carbon footprint between Chita-Kadala International Airport (HTA) and Ushuaia – Malvinas Argentinas International Airport (USH)

On average, flying from Chita to Ushuaia generates about 1 664 kg of CO2 per passenger, equivalent to 3 668 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
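The unit conversion and the implied emission rate are simple arithmetic, sketched below using the figures from this page:

    KG_PER_LB = 0.45359237  # exact definition of the avoirdupois pound

    co2_kg = 1664        # per-passenger estimate from this page
    distance_km = 19674

    print(round(co2_kg / KG_PER_LB))           # 3668 lbs
    print(round(co2_kg * 1000 / distance_km))  # ≈ 85 g of CO2 per passenger-kilometer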

Map of flight path from Chita to Ushuaia

See the map of the shortest flight path between Chita-Kadala International Airport (HTA) and Ushuaia – Malvinas Argentinas International Airport (USH).

Airport information

Origin: Chita-Kadala International Airport
City: Chita
Country: Russia
IATA Code: HTA
ICAO Code: UIAA
Coordinates: 52°1′34″N, 113°18′21″E
Destination: Ushuaia – Malvinas Argentinas International Airport
City: Ushuaia
Country: Argentina
IATA Code: USH
ICAO Code: SAWH
Coordinates: 54°50′35″S, 68°17′44″W
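To plug the listed coordinates into either distance formula, they first need converting from degrees-minutes-seconds to signed decimal degrees. A small sketch (airport values copied from the listings above; haversine_km is the function from the earlier sketch):

    def dms_to_decimal(degrees, minutes, seconds, hemisphere):
        """Convert a degrees/minutes/seconds coordinate to signed decimal degrees."""
        sign = -1 if hemisphere in ("S", "W") else 1
        return sign * (degrees + minutes / 60 + seconds / 3600)

    # Chita-Kadala (HTA): 52°1′34″N, 113°18′21″E
    hta = (dms_to_decimal(52, 1, 34, "N"), dms_to_decimal(113, 18, 21, "E"))
    # Ushuaia – Malvinas Argentinas (USH): 54°50′35″S, 68°17′44″W
    ush = (dms_to_decimal(54, 50, 35, "S"), dms_to_decimal(68, 17, 44, "W"))

    print(round(haversine_km(*hta, *ush)))  # ≈ 19684 km, matching the haversine figure above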