
How far is Buenos Aires from Johannesburg?

The distance between Johannesburg (Lanseria International Airport) and Buenos Aires (Aeroparque Jorge Newbery) is 5041 miles / 8112 kilometers / 4380 nautical miles.


Distance from Johannesburg to Buenos Aires

There are several ways to calculate the distance from Johannesburg to Buenos Aires. Here are two standard methods:

Vincenty's formula (applied above)
  • 5040.807 miles
  • 8112.393 kilometers
  • 4380.342 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
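
For readers who want to reproduce the figure, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The function name and the decimal-degree coordinates are ours (converted from the DMS coordinates in the airport information section below); the site's exact implementation is not published.

    import math

    def vincenty_distance(lat1, lon1, lat2, lon2):
        """Vincenty's inverse formula on the WGS-84 ellipsoid.
        Takes decimal degrees, returns distance in meters."""
        a = 6378137.0               # WGS-84 semi-major axis (m)
        f = 1 / 298.257223563       # WGS-84 flattening
        b = (1 - f) * a             # semi-minor axis

        # Reduced latitudes and longitude difference
        U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
        U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
        L = math.radians(lon2 - lon1)
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)

        lam = L
        for _ in range(200):        # iterate until lambda converges
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.hypot(cosU2 * sin_lam,
                                   cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
            if sin_sigma == 0:
                return 0.0          # coincident points
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos2_alpha = 1 - sin_alpha ** 2
            cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                            if cos2_alpha else 0.0)
            C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
            lam_prev = lam
            lam = L + (1 - C) * f * sin_alpha * (
                sigma + C * sin_sigma * (
                    cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
            if abs(lam - lam_prev) < 1e-12:
                break

        u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        delta_sigma = B * sin_sigma * (
            cos_2sigma_m + B / 4 * (
                cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
                - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
                * (-3 + 4 * cos_2sigma_m ** 2)))
        return b * A * (sigma - delta_sigma)   # meters

    # HLA -> AEP, decimal degrees converted from the DMS coordinates below
    meters = vincenty_distance(-25.9383, 27.9258, -34.5592, -58.4156)
    print(f"{meters / 1000:.1f} km")  # ~8112 km, matching the figure above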

Haversine formula
  • 5031.379 miles
  • 8097.220 kilometers
  • 4372.149 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest path between two points along the surface of the sphere).
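
The haversine calculation is short enough to sketch in full. The mean earth radius of 6,371 km is the commonly used value; the site does not state which radius it assumes.

    import math

    def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
        """Great-circle distance in kilometers, assuming a spherical earth."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        a = (math.sin(dphi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
        return 2 * radius_km * math.asin(math.sqrt(a))

    km = haversine_distance(-25.9383, 27.9258, -34.5592, -58.4156)
    print(f"{km:.1f} km")  # ~8097 km, matching the figure above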

How long does it take to fly from Johannesburg to Buenos Aires?

The estimated flight time from Lanseria International Airport to Aeroparque Jorge Newbery is 10 hours and 2 minutes.
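
The site's exact flight-time formula is not given; a common rough estimate is distance divided by an assumed average block speed. A sketch with an assumed speed of 500 mph:

    # Rough flight-time estimate: distance / assumed average block speed.
    # 500 mph is our assumption, not a figure stated by the source.
    distance_miles = 5041
    avg_speed_mph = 500
    hours = distance_miles / avg_speed_mph
    print(f"{int(hours)} h {round(hours % 1 * 60)} min")  # -> 10 h 5 min

With these assumed numbers the estimate comes out near 10 hours 5 minutes, close to the 10 hours 2 minutes quoted above; the exact figure depends on the speed and any taxi or climb allowance used.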

Flight carbon footprint between Lanseria International Airport (HLA) and Aeroparque Jorge Newbery (AEP)

On average, flying from Johannesburg to Buenos Aires generates about 589 kg of CO2 per passenger; 589 kilograms equals 1,299 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
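
The pound figure follows directly from the kilogram figure by unit conversion:

    co2_kg = 589                  # per-passenger estimate from above
    co2_lbs = co2_kg * 2.20462    # 1 kg is about 2.20462 lb
    print(f"{co2_lbs:,.0f} lbs")  # -> 1,299 lbs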

Map of flight path from Johannesburg to Buenos Aires

See the map of the shortest flight path between Lanseria International Airport (HLA) and Aeroparque Jorge Newbery (AEP).

Airport information

Origin: Lanseria International Airport
City: Johannesburg
Country: South Africa
IATA Code: HLA
ICAO Code: FALA
Coordinates: 25°56′18″S, 27°55′33″E
Destination: Aeroparque Jorge Newbery
City: Buenos Aires
Country: Argentina
IATA Code: AEP
ICAO Code: SABE
Coordinates: 34°33′33″S, 58°24′56″W
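
The coordinates above are given in degrees-minutes-seconds. A small sketch (the helper name is ours) converting them to the signed decimal degrees used by the distance formulas earlier:

    import re

    def dms_to_decimal(dms: str) -> float:
        """Convert a coordinate like '25°56′18″S' to signed decimal degrees."""
        deg, minutes, seconds, hemi = re.match(
            r"(\d+)°(\d+)′(\d+)″([NSEW])", dms).groups()
        value = int(deg) + int(minutes) / 60 + int(seconds) / 3600
        return -value if hemi in "SW" else value   # south/west are negative

    print(dms_to_decimal("25°56′18″S"))  # -> -25.9383...
    print(dms_to_decimal("58°24′56″W"))  # -> -58.4155...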