
How far is Juliaca from Havana?

The distance between Havana (José Martí International Airport) and Juliaca (Inca Manco Cápac International Airport) is 2770 miles / 4458 kilometers / 2407 nautical miles.

José Martí International Airport – Inca Manco Cápac International Airport

2770 miles · 4458 kilometers · 2407 nautical miles


Distance from Havana to Juliaca

There are several ways to calculate the distance from Havana to Juliaca. Here are two standard methods:

Vincenty's formula (applied above)
  • 2770.362 miles
  • 4458.465 kilometers
  • 2407.378 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
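
For an ellipsoidal (WGS-84) distance of this kind, a library such as geopy can be used; its geodesic distance (Karney's method) agrees with Vincenty's formula to well within a metre on a route like this. A minimal sketch, assuming geopy is installed and using the airport coordinates listed further down in decimal degrees:

```python
from geopy.distance import geodesic

# Airport coordinates in decimal degrees (latitude, longitude)
hav = (22.9892, -82.4089)   # José Martí International Airport (HAV)
jul = (-15.4669, -70.1581)  # Inca Manco Cápac International Airport (JUL)

d = geodesic(hav, jul)      # ellipsoidal (WGS-84) distance
print(f"{d.miles:.3f} mi / {d.kilometers:.3f} km / {d.nautical:.3f} nmi")
# Expect roughly 2770 mi / 4458 km / 2407 nmi, matching the figures above
```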

Haversine formula
  • 2783.179 miles
  • 4479.093 kilometers
  • 2418.517 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
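
The haversine formula itself is short enough to implement directly. A minimal sketch using the same coordinates and a mean earth radius of 6371 km:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two points on a spherical earth."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
    return 2 * radius_km * asin(sqrt(a))

km = haversine_km(22.9892, -82.4089, -15.4669, -70.1581)
print(f"{km:.0f} km ≈ {km / 1.609344:.0f} mi ≈ {km / 1.852:.0f} nmi")
# Roughly 4479 km / 2783 mi / 2419 nmi, matching the haversine figures above
```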

How long does it take to fly from Havana to Juliaca?

The estimated flight time from José Martí International Airport to Inca Manco Cápac International Airport is 5 hours and 44 minutes.
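
The calculator's exact timing model is not published, but estimates like this usually divide the distance by an assumed average gate-to-gate speed. A rough sketch under that assumption; the speed below is simply chosen to illustrate (about 483 mph reproduces the 5 hours 44 minutes quoted above) and is not the site's actual parameter:

```python
def estimate_flight_time(distance_miles, avg_block_speed_mph=483.0):
    """Rough block-time estimate from distance and an assumed average gate-to-gate speed."""
    minutes = distance_miles / avg_block_speed_mph * 60
    hours, mins = divmod(round(minutes), 60)
    return f"{hours} h {mins:02d} min"

print(estimate_flight_time(2770.362))  # ≈ 5 h 44 min with the assumed speed
```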

Flight carbon footprint between José Martí International Airport (HAV) and Inca Manco Cápac International Airport (JUL)

On average, flying from Havana to Juliaca generates about 307 kg (677 lb) of CO2 per passenger. The figures are estimates and include only the CO2 generated by burning jet fuel.
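
The 307 kg figure works out to roughly 69 g of CO2 per passenger-kilometre over the 4458 km route. A minimal sketch of that style of per-distance estimate; the emission factor here is back-calculated from the numbers above for illustration, not the calculator's published methodology:

```python
KG_PER_LB = 0.45359237

def co2_per_passenger_kg(distance_km, kg_co2_per_pax_km=0.069):
    """Per-passenger CO2 estimate from distance and an assumed emission factor (jet-fuel burn only)."""
    return distance_km * kg_co2_per_pax_km

kg = co2_per_passenger_kg(4458)
print(f"{kg:.0f} kg CO2 ≈ {kg / KG_PER_LB:.0f} lb CO2")
# ≈ 308 kg / 678 lb with the assumed factor; the page quotes 307 kg / 677 lb
```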

Map of flight path from Havana to Juliaca

See the map of the shortest flight path between José Martí International Airport (HAV) and Inca Manco Cápac International Airport (JUL).

Airport information

Origin: José Martí International Airport
City: Havana
Country: Cuba
IATA Code: HAV
ICAO Code: MUHA
Coordinates: 22°59′21″N, 82°24′32″W

Destination: Inca Manco Cápac International Airport
City: Juliaca
Country: Perú
IATA Code: JUL
ICAO Code: SPJL
Coordinates: 15°28′1″S, 70°9′29″W
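
The coordinates above are given in degrees, minutes and seconds, while the distance formulas expect decimal degrees. A small conversion helper (hypothetical, for illustration only):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere letter to signed decimal degrees."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

# José Martí International Airport (HAV): 22°59′21″N, 82°24′32″W
print(dms_to_decimal(22, 59, 21, "N"), dms_to_decimal(82, 24, 32, "W"))  # ≈ 22.9892, -82.4089

# Inca Manco Cápac International Airport (JUL): 15°28′1″S, 70°9′29″W
print(dms_to_decimal(15, 28, 1, "S"), dms_to_decimal(70, 9, 29, "W"))    # ≈ -15.4669, -70.1581
```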