How far is Shanghai from Naha?

The distance between Naha (Naha Airport) and Shanghai (Shanghai Hongqiao International Airport) is 515 miles / 829 kilometers / 448 nautical miles.

The driving distance from Naha (OKA) to Shanghai (SHA) is 2484 miles / 3998 kilometers, and travel time by car is about 173 hours 59 minutes.

Naha Airport – Shanghai Hongqiao International Airport

515 miles / 829 kilometers / 448 nautical miles

Distance from Naha to Shanghai

There are several ways to calculate the distance from Naha to Shanghai. Here are two standard methods:

Vincenty's formula (applied above)
  • 515.036 miles
  • 828.870 kilometers
  • 447.554 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
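
For reference, a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid is shown below. The ellipsoid choice, convergence tolerance, and iteration cap are illustrative assumptions, not the exact parameters behind the figures above.

```python
import math

# WGS-84 ellipsoid parameters (assumed; the page does not state which ellipsoid it uses)
WGS84_A = 6378137.0                # semi-major axis in metres
WGS84_F = 1 / 298.257223563        # flattening
WGS84_B = (1 - WGS84_F) * WGS84_A  # semi-minor axis in metres


def vincenty_distance_m(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Distance in metres between two lat/lon points (degrees) on the WGS-84 ellipsoid."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)

    # Reduced latitudes
    U1 = math.atan((1 - WGS84_F) * math.tan(phi1))
    U2 = math.atan((1 - WGS84_F) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # cos(2 * sigma_m); the equatorial case (cos2_alpha == 0) degenerates to 0
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
        C = WGS84_F / 16 * cos2_alpha * (4 + WGS84_F * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * WGS84_F * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m
                                     + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (WGS84_A ** 2 - WGS84_B ** 2) / WGS84_B ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
        - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sigma_m ** 2)))
    return WGS84_B * A * (sigma - delta_sigma)  # metres
```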

Haversine formula
  • 515.257 miles
  • 829.225 kilometers
  • 447.746 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
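
For comparison, a short Python sketch of the haversine calculation follows; the mean Earth radius of 6,371 km is an assumed value, since the page does not state which radius it uses.

```python
import math

EARTH_RADIUS_KM = 6371.0  # mean Earth radius (assumed value)


def haversine_distance_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two lat/lon points given in degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    d_phi = math.radians(lat2 - lat1)
    d_lam = math.radians(lon2 - lon1)

    a = (math.sin(d_phi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(d_lam / 2) ** 2)
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))
```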

How long does it take to fly from Naha to Shanghai?

The estimated flight time from Naha Airport to Shanghai Hongqiao International Airport is 1 hour and 28 minutes.
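
A common rule of thumb for such estimates is to divide the great-circle distance by a typical cruise speed and add a fixed allowance for taxi, climb, and descent. The 500 mph cruise speed and 30-minute allowance in the sketch below are illustrative assumptions, not the exact parameters behind the 1 hour 28 minutes figure.

```python
def estimate_flight_time_minutes(distance_miles, cruise_mph=500, overhead_minutes=30):
    """Rough block-time estimate: cruise time plus a fixed taxi/climb/descent allowance."""
    return distance_miles / cruise_mph * 60 + overhead_minutes


# About 92 minutes (~1 h 32 min) for the 515-mile Naha-Shanghai leg with these assumptions
print(estimate_flight_time_minutes(515))
```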

Flight carbon footprint between Naha Airport (OKA) and Shanghai Hongqiao International Airport (SHA)

On average, flying from Naha to Shanghai generates about 101 kg (222 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
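
A per-passenger figure like this can be approximated by multiplying the flight distance by an emission factor. The factor of roughly 0.12 kg CO2 per passenger-kilometre below is an assumed short-haul value chosen only to illustrate the order of magnitude; the pound conversion uses 1 kg ≈ 2.20462 lbs.

```python
LBS_PER_KG = 2.20462  # pounds per kilogram


def estimate_co2_kg(distance_km, kg_co2_per_pax_km=0.12):
    """Rough per-passenger CO2 estimate from distance and an assumed emission factor."""
    return distance_km * kg_co2_per_pax_km


co2_kg = estimate_co2_kg(829)      # ~99 kg with the assumed 0.12 kg/km factor
co2_lbs = 101 * LBS_PER_KG         # the quoted 101 kg is about 222.7 lbs
print(round(co2_kg), round(co2_lbs, 1))
```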

Map of flight path and driving directions from Naha to Shanghai

See the map of the shortest flight path between Naha Airport (OKA) and Shanghai Hongqiao International Airport (SHA).

Airport information

Origin: Naha Airport
City: Naha
Country: Japan
IATA Code: OKA
ICAO Code: ROAH
Coordinates: 26°11′44″N, 127°38′45″E
Destination: Shanghai Hongqiao International Airport
City: Shanghai
Country: China
IATA Code: SHA
ICAO Code: ZSSS
Coordinates: 31°11′52″N, 121°20′9″E
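
As a usage check, the DMS coordinates above can be converted to decimal degrees and fed into the haversine_distance_km sketch from earlier; the conversion helper below is illustrative.

```python
def dms_to_decimal(degrees, minutes, seconds):
    """Convert degrees/minutes/seconds to decimal degrees."""
    return degrees + minutes / 60 + seconds / 3600


# OKA: 26°11′44″N, 127°38′45″E  /  SHA: 31°11′52″N, 121°20′9″E
oka_lat, oka_lon = dms_to_decimal(26, 11, 44), dms_to_decimal(127, 38, 45)
sha_lat, sha_lon = dms_to_decimal(31, 11, 52), dms_to_decimal(121, 20, 9)

# Roughly 829 km, in line with the haversine figure quoted above
print(haversine_distance_km(oka_lat, oka_lon, sha_lat, sha_lon))
```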