
Distance between Jiujiang (JIU) and Shenzhen (SZX)

Flight distance from Jiujiang to Shenzhen (Jiujiang Lushan Airport – Shenzhen Bao'an International Airport) is 507 miles / 815 kilometers / 440 nautical miles. Estimated flight time is 1 hour 27 minutes.

Driving distance from Jiujiang (JIU) to Shenzhen (SZX) is 592 miles / 952 kilometers and travel time by car is about 10 hours 41 minutes.

Jiujiang – Shenzhen

507 miles / 815 kilometers / 440 nautical miles

How far is Shenzhen from Jiujiang?

There are several ways to calculate the distance between Jiujiang and Shenzhen. Here are two common methods:

Vincenty's formula (applied above)
  • 506.602 miles
  • 815.296 kilometers
  • 440.225 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth’s surface, using an ellipsoidal model of the earth.
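
As a rough illustration, the ellipsoidal figure above can be reproduced with a few lines of Python. The sketch below uses pyproj's Geod class (a library choice assumed here, not the calculator's own code); its inverse geodesic solver works on the same WGS-84 ellipsoid that Vincenty's formula targets, so the result should agree closely with the number quoted above.

```python
# Sketch: ellipsoidal (WGS-84) distance between JIU and SZX.
# pyproj's Geod solves the inverse geodesic problem that Vincenty's
# formula addresses; the library choice is an assumption, not the
# calculator's documented method.
from pyproj import Geod

# Airport coordinates from the table below, converted to decimal degrees.
JIU = (29 + 43/60 + 58/3600, 115 + 58/60 + 58/3600)   # lat, lon
SZX = (22 + 38/60 + 21/3600, 113 + 48/60 + 39/3600)   # lat, lon

geod = Geod(ellps="WGS84")
_, _, meters = geod.inv(JIU[1], JIU[0], SZX[1], SZX[0])  # note lon, lat order

print(f"{meters / 1609.344:.1f} miles")       # ~507 miles
print(f"{meters / 1000:.1f} kilometers")      # ~815 kilometers
print(f"{meters / 1852:.1f} nautical miles")  # ~440 nautical miles
```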

Haversine formula
  • 508.253 miles
  • 817.953 kilometers
  • 441.659 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
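
For comparison, the spherical great-circle figure can be computed directly from the haversine formula. The short sketch below assumes a mean Earth radius of 6,371 km, which is why its result differs slightly from the ellipsoidal one.

```python
# Sketch: great-circle (haversine) distance on a spherical Earth.
# The 6,371 km mean radius is an assumption; a different radius shifts
# the result slightly.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points, in kilometers."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
    return 2 * radius_km * asin(sqrt(a))

km = haversine_km(29.7328, 115.9828, 22.6392, 113.8108)  # JIU -> SZX
print(f"{km:.1f} km / {km / 1.609344:.1f} mi / {km / 1.852:.1f} NM")  # ~818 km
```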

Flight Duration

Estimated flight time from Jiujiang Lushan Airport to Shenzhen Bao'an International Airport is 1 hour 27 minutes.

Time difference

There is no time difference between Jiujiang and Shenzhen.

Carbon dioxide emissions

On average, flying from Jiujiang to Shenzhen generates about 100 kg of CO2 per passenger; 100 kilograms is equal to 220 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.

Map of flight path and driving directions from Jiujiang to Shenzhen

Shortest flight path between Jiujiang Lushan Airport (JIU) and Shenzhen Bao'an International Airport (SZX).

Airport information

Origin: Jiujiang Lushan Airport
City: Jiujiang
Country: China
IATA Code: JIU
ICAO Code: ZSJJ
Coordinates: 29°43′58″N, 115°58′58″E
Destination: Shenzhen Bao'an International Airport
City: Shenzhen
Country: China
IATA Code: SZX
ICAO Code: ZGSZ
Coordinates: 22°38′21″N, 113°48′39″E