How far is Buenos Aires from Tokyo?

The distance between Tokyo (Narita International Airport) and Buenos Aires (Aeroparque Jorge Newbery) is 11378 miles / 18311 kilometers / 9887 nautical miles.

Narita International Airport – Aeroparque Jorge Newbery
  • Distance: 11378 miles / 18311 kilometers / 9887 nautical miles
  • Flight time: 22 h 2 min
  • CO2 emission: 1 523 kg

Distance from Tokyo to Buenos Aires

There are several ways to calculate the distance from Tokyo to Buenos Aires. Here are two standard methods:

Vincenty's formula (applied above)
  • 11377.636 miles
  • 18310.530 kilometers
  • 9886.895 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
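For reference, here is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid. The function name `vincenty_miles`, the convergence tolerance, and the iteration cap are illustrative choices rather than this site's actual code; fed the airport coordinates listed at the bottom of this page, it should come out within a fraction of a mile of the 11377.636-mile figure above.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Inverse Vincenty: geodesic distance on the WGS-84 ellipsoid, in statute miles."""
    a = 6378137.0              # WGS-84 semi-major axis (meters)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = a * (1 - f)            # semi-minor axis

    L = math.radians(lon2 - lon1)
    # Reduced latitudes on the auxiliary sphere
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L  # first guess for the longitude difference on the auxiliary sphere
    for _ in range(max_iter):  # iterate until lambda converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # Note: cos2_alpha is 0 for geodesics along the equator; not the case here
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    meters = b * A * (sigma - delta_sigma)
    return meters / 1609.344   # meters -> statute miles

# NRT and AEP in decimal degrees (converted from the DMS coordinates below)
print(round(vincenty_miles(35.7644, 140.3858, -34.5592, -58.4156), 1))
```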

Haversine formula
  • 11373.166 miles
  • 18303.337 kilometers
  • 9883.011 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
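A matching haversine sketch, assuming the conventional mean Earth radius of 6 371 km (the page does not state which radius it uses, so the last digits may differ slightly). The roughly 4.5-mile gap between the two results above is the difference between the spherical and ellipsoidal Earth models.

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given radius, in statute miles."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    # Haversine of the central angle between the two points
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    km = 2 * radius_km * math.asin(math.sqrt(h))
    return km / 1.609344   # kilometers -> statute miles

print(round(haversine_miles(35.7644, 140.3858, -34.5592, -58.4156), 1))
```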

How long does it take to fly from Tokyo to Buenos Aires?

The estimated flight time from Narita International Airport to Aeroparque Jorge Newbery is 22 hours and 2 minutes.
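Calculators like this one typically estimate flight time as the great-circle distance divided by an assumed cruise speed, plus a fixed allowance for taxi, climb, and descent. The exact parameters behind the 22 h 2 min figure are not published, so the sketch below uses illustrative values (a 30-minute allowance and a 500 mph cruise speed) and will not reproduce that figure exactly.

```python
def estimate_flight_time(distance_miles, cruise_mph=500.0, overhead_min=30):
    """Rule-of-thumb flight time: fixed overhead plus cruise at constant speed.
    cruise_mph and overhead_min are illustrative assumptions, not the site's model."""
    total_min = overhead_min + distance_miles / cruise_mph * 60
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} h {minutes} min"

print(estimate_flight_time(11378))   # illustrative estimate for NRT-AEP
```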

Flight carbon footprint between Narita International Airport (NRT) and Aeroparque Jorge Newbery (AEP)

On average, flying from Tokyo to Buenos Aires generates about 1 523 kg of CO2 per passenger, which is equivalent to about 3 357 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
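The pound-to-kilogram conversion is exact by definition (1 lb = 0.45359237 kg), so the pounds figure can be checked directly:

```python
KG_PER_LB = 0.45359237          # exact definition of the avoirdupois pound

co2_kg = 1523
co2_lbs = co2_kg / KG_PER_LB    # 1 523 kg comes to about 3 357.6 lbs
print(round(co2_lbs, 1))
```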

Map of flight path from Tokyo to Buenos Aires

See the map of the shortest flight path between Narita International Airport (NRT) and Aeroparque Jorge Newbery (AEP).

Airport information

Origin: Narita International Airport
City: Tokyo
Country: Japan
IATA Code: NRT
ICAO Code: RJAA
Coordinates: 35°45′52″N, 140°23′9″E
Destination: Aeroparque Jorge Newbery
City: Buenos Aires
Country: Argentina
IATA Code: AEP
ICAO Code: SABE
Coordinates: 34°33′33″S, 58°24′56″W
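The coordinates above are given in degrees, minutes, and seconds, while the distance formulas earlier on this page expect decimal degrees. A small conversion sketch (the helper name `dms_to_decimal` is illustrative):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# NRT: 35°45′52″N, 140°23′9″E  ->  (35.7644, 140.3858)
nrt = (dms_to_decimal(35, 45, 52, "N"), dms_to_decimal(140, 23, 9, "E"))
# AEP: 34°33′33″S, 58°24′56″W  ->  (-34.5592, -58.4156)
aep = (dms_to_decimal(34, 33, 33, "S"), dms_to_decimal(58, 24, 56, "W"))
print(nrt, aep)
```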