
How far is Shenzhen from Naha?

The distance between Naha (Naha Airport) and Shenzhen (Shenzhen Bao'an International Airport) is 905 miles / 1456 kilometers / 786 nautical miles.

The driving distance from Naha (OKA) to Shenzhen (SZX) is 3175 miles / 5110 kilometers, and travel time by car is about 186 hours 30 minutes.

Naha Airport – Shenzhen Bao'an International Airport

  • 905 miles
  • 1456 kilometers
  • 786 nautical miles


Distance from Naha to Shenzhen

There are several ways to calculate the distance from Naha to Shenzhen. Here are two standard methods:

Vincenty's formula (applied above)
  • 905.022 miles
  • 1456.492 kilometers
  • 786.443 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
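As a concrete sketch, here is a self-contained Python implementation of Vincenty's inverse formula on the WGS-84 ellipsoid, applied to the airport coordinates listed below. The convergence tolerance and iteration cap are conventional choices, not parameters published by this site.

```python
import math

def vincenty(lat1, lon1, lat2, lon2,
             a=6378137.0, f=1 / 298.257223563, tol=1e-12, max_iter=200):
    """Vincenty's inverse formula on the WGS-84 ellipsoid; returns meters."""
    b = a * (1 - f)                       # semi-minor axis
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L                               # iterate on the auxiliary longitude
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0                    # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sig_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                      if cos2_alpha else 0.0)  # equatorial geodesic
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sig_m + C * cos_sigma * (-1 + 2 * cos_2sig_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sig_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sig_m ** 2)
            - B / 6 * cos_2sig_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sig_m ** 2)))
    return b * A * (sigma - delta_sigma)

# OKA (26°11′44″N, 127°38′45″E) to SZX (22°38′21″N, 113°48′39″E)
m = vincenty(26.195556, 127.645833, 22.639167, 113.810833)
print(m / 1609.344)   # ≈ 905.0 miles
print(m / 1000)       # ≈ 1456.5 kilometers
print(m / 1852)       # ≈ 786.4 nautical miles
```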

Haversine formula
  • 903.863 miles
  • 1454.627 kilometers
  • 785.436 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
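A minimal haversine sketch in the same style; the mean Earth radius of 6,371.0088 km is a common convention, and a slightly different assumed radius shifts the result by a mile or so.

```python
import math

def haversine(lat1, lon1, lat2, lon2, r=6371.0088):
    """Great-circle distance in kilometers on a sphere of mean radius r."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

km = haversine(26.195556, 127.645833, 22.639167, 113.810833)
print(km)             # ≈ 1454.6 kilometers
print(km / 1.609344)  # ≈ 903.9 miles
print(km / 1.852)     # ≈ 785.4 nautical miles
```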

How long does it take to fly from Naha to Shenzhen?

The estimated flight time from Naha Airport to Shenzhen Bao'an International Airport is 2 hours and 12 minutes.
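The calculator does not publish its timing formula, but a common rule of thumb reproduces the figure closely: divide the distance by an assumed average block speed and add a fixed allowance for taxi, climb, and descent. The speed and overhead below are illustrative assumptions, not the site's actual parameters.

```python
# Hypothetical rule-of-thumb estimator; cruise_mph and overhead_hours
# are assumed values, not parameters published by this calculator.
def flight_time_hours(distance_miles, cruise_mph=500.0, overhead_hours=0.4):
    return overhead_hours + distance_miles / cruise_mph

hours = flight_time_hours(905)
h, m = int(hours), round(hours % 1 * 60)
print(f"{h} h {m} min")  # ≈ 2 h 13 min under these assumptions
```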

Flight carbon footprint between Naha Airport (OKA) and Shenzhen Bao'an International Airport (SZX)

On average, flying from Naha to Shenzhen generates about 144 kg of CO2 per passenger, which is equivalent to 317 pounds (lbs). These figures are estimates that include only the CO2 generated by burning jet fuel.
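For illustration, a simple per-passenger model multiplies the flight distance by an assumed average fuel burn per seat-kilometer and by the roughly 3.16 kg of CO2 released per kilogram of jet fuel burned. The fuel-burn figure below is an assumption chosen to land near the quoted estimate, not the site's published model.

```python
FUEL_PER_SEAT_KM = 0.0313  # kg jet fuel per passenger-km (assumed average)
CO2_PER_KG_FUEL = 3.16     # kg CO2 per kg of jet fuel burned (standard factor)

def co2_per_passenger_kg(distance_km):
    """Estimated per-passenger CO2 for a flight of the given length."""
    return distance_km * FUEL_PER_SEAT_KM * CO2_PER_KG_FUEL

kg = co2_per_passenger_kg(1456)
print(round(kg))            # ≈ 144 kg CO2 per passenger
print(round(kg * 2.20462))  # ≈ 317 lbs
```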

Map of flight path and driving directions from Naha to Shenzhen

See the map of the shortest flight path between Naha Airport (OKA) and Shenzhen Bao'an International Airport (SZX).

Airport information

Origin: Naha Airport
City: Naha
Country: Japan
IATA Code: OKA
ICAO Code: ROAH
Coordinates: 26°11′44″N, 127°38′45″E
Destination: Shenzhen Bao'an International Airport
City: Shenzhen
Country: China
IATA Code: SZX
ICAO Code: ZGSZ
Coordinates: 22°38′21″N, 113°48′39″E