
How far is Shihezi from Jiujiang?

The distance between Jiujiang (Jiujiang Lushan Airport) and Shihezi (Shihezi Huayuan Airport) is 1926 miles / 3099 kilometers / 1673 nautical miles.

The driving distance from Jiujiang (JIU) to Shihezi (SHF) is 2250 miles / 3621 kilometers, and travel time by car is about 40 hours 37 minutes.
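The three figures above are the same great-circle distance expressed in different units. A quick Python check of the conversions (the constants are the standard statute-mile and nautical-mile factors):

```python
MI_PER_KM = 0.621371   # statute miles per kilometre (1 mi = 1.609344 km)
NM_PER_KM = 0.539957   # nautical miles per kilometre (1 NM = 1852 m)

km = 3099.195  # Vincenty distance from the section below
print(f"{km * MI_PER_KM:.0f} mi / {km:.0f} km / {km * NM_PER_KM:.0f} NM")
# -> 1926 mi / 3099 km / 1673 NM
```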

Jiujiang Lushan Airport – Shihezi Huayuan Airport

1926 miles / 3099 kilometers / 1673 nautical miles


Distance from Jiujiang to Shihezi

There are several ways to calculate the distance from Jiujiang to Shihezi. Here are two standard methods:

Vincenty's formula (applied above)
  • 1925.750 miles
  • 3099.195 kilometers
  • 1673.431 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
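For reference, here is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid. It is a standard textbook implementation, not the calculator's own code; the convergence tolerance and iteration cap are arbitrary choices, and the method can fail to converge for nearly antipodal points:

```python
import math

def vincenty_distance_m(lat1, lon1, lat2, lon2):
    """Ellipsoidal distance in metres (WGS-84), Vincenty inverse method."""
    a = 6378137.0           # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563   # WGS-84 flattening
    b = (1 - f) * a         # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):                      # iterate lambda to convergence
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.hypot(cosU2 * sinLam,
                              cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        if sinSigma == 0:
            return 0.0                        # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = cosSigma - 2 * sinU1 * sinU2 / cos2Alpha
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (sigma + C * sinSigma *
              (cos2SigmaM + C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < 1e-12:
            break

    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    dSigma = B * sinSigma * (cos2SigmaM + B / 4 *
             (cosSigma * (-1 + 2 * cos2SigmaM ** 2) -
              B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2) *
              (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - dSigma)

# JIU (29°43′58″N, 115°58′58″E) to SHF (44°14′31″N, 85°53′25″E)
print(vincenty_distance_m(29.7328, 115.9828, 44.2419, 85.8903) / 1000)
# -> roughly 3099 km, matching the figure above
```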

Haversine formula
  • 1923.504 miles
  • 3095.579 kilometers
  • 1671.479 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
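A haversine sketch in the same style (the 6371 km mean Earth radius is a common convention; the result shifts slightly with the radius chosen):

```python
import math

def haversine_distance_km(lat1, lon1, lat2, lon2, r_km=6371.0):
    """Great-circle distance on a sphere of radius r_km."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r_km * math.asin(math.sqrt(h))

print(haversine_distance_km(29.7328, 115.9828, 44.2419, 85.8903))
# -> roughly 3096 km, matching the haversine figure above
```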

How long does it take to fly from Jiujiang to Shihezi?

The estimated flight time from Jiujiang Lushan Airport to Shihezi Huayuan Airport is 4 hours and 8 minutes.
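The calculator does not publish its estimator, but figures like this usually come from distance divided by an average block speed, plus a fixed allowance for taxi, climb, and descent. A sketch with illustrative constants (the 500 mph speed and 30-minute overhead are assumptions, which is why the result lands near, not exactly on, the 4 hours 8 minutes quoted above):

```python
def estimated_block_time(distance_mi, avg_speed_mph=500, overhead_min=30):
    """Cruise time plus a fixed taxi/climb/descent allowance.
    Both constants are illustrative assumptions."""
    total_min = round(distance_mi / avg_speed_mph * 60 + overhead_min)
    h, m = divmod(total_min, 60)
    return f"{h} hours {m} minutes"

print(estimated_block_time(1926))  # -> "4 hours 21 minutes"
```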

Flight carbon footprint between Jiujiang Lushan Airport (JIU) and Shihezi Huayuan Airport (SHF)

On average, flying from Jiujiang to Shihezi generates about 211 kg of CO2 per passenger, which is equivalent to roughly 464 pounds (lbs). This figure is an estimate and includes only the CO2 generated by burning jet fuel.
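The emissions model behind the 211 kg figure is not stated; a common back-of-the-envelope approach multiplies the flight distance by a per-passenger-kilometre factor. In this sketch the 68 g CO2 per passenger-km factor is an assumption chosen to land in the right range for a jet flight of this length:

```python
KG_CO2_PER_PAX_KM = 0.068   # assumed emission factor (jet fuel only)
LB_PER_KG = 2.20462

distance_km = 3099
co2_kg = distance_km * KG_CO2_PER_PAX_KM
print(f"{co2_kg:.0f} kg CO2 (~{co2_kg * LB_PER_KG:.0f} lbs) per passenger")
# -> 211 kg (~465 lbs); the quoted 464 lbs reflects rounding upstream
```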

Map of flight path and driving directions from Jiujiang to Shihezi

See the map of the shortest flight path between Jiujiang Lushan Airport (JIU) and Shihezi Huayuan Airport (SHF).

Airport information

Origin: Jiujiang Lushan Airport
City: Jiujiang
Country: China
IATA Code: JIU
ICAO Code: ZSJJ
Coordinates: 29°43′58″N, 115°58′58″E
Destination: Shihezi Huayuan Airport
City: Shihezi
Country: China
IATA Code: SHF
ICAO Code: ZWHZ
Coordinates: 44°14′31″N, 85°53′25″E
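The coordinates above are in degrees-minutes-seconds, while the distance formulas earlier expect decimal degrees. A small conversion helper (the function name is ours):

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

print(dms_to_decimal(29, 43, 58, "N"))   # JIU latitude  -> 29.7328
print(dms_to_decimal(115, 58, 58, "E"))  # JIU longitude -> 115.9828
print(dms_to_decimal(44, 14, 31, "N"))   # SHF latitude  -> 44.2419
print(dms_to_decimal(85, 53, 25, "E"))   # SHF longitude -> 85.8903
```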