
How far is Wuxi from Huaihua?

The distance between Huaihua (Huaihua Zhijiang Airport) and Wuxi (Sunan Shuofang International Airport) is 704 miles / 1133 kilometers / 612 nautical miles.

The driving distance from Huaihua (HJJ) to Wuxi (WUX) is 874 miles / 1406 kilometers, and travel time by car is about 15 hours 58 minutes.

Huaihua Zhijiang Airport – Sunan Shuofang International Airport

704 miles / 1133 kilometers / 612 nautical miles


Distance from Huaihua to Wuxi

There are several ways to calculate the distance from Huaihua to Wuxi. Here are two standard methods:

Vincenty's formula (applied above)
  • 703.884 miles
  • 1132.792 kilometers
  • 611.659 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
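An ellipsoidal distance like this can be reproduced in a few lines of Python. The sketch below is an assumption, not how this page computes its figures: it uses the geopy library, whose geodesic solver works on the WGS-84 ellipsoid and agrees with Vincenty's formula to well under a meter at this range. The coordinates come from the airport information section below, converted to decimal degrees.

    from geopy.distance import geodesic  # pip install geopy

    # Airport coordinates from the airport information below, decimal degrees
    hjj = (27.440833, 109.700000)   # Huaihua Zhijiang (HJJ)
    wux = (31.494167, 120.428889)   # Sunan Shuofang (WUX)

    d = geodesic(hjj, wux)          # WGS-84 ellipsoidal distance
    print(f"{d.miles:.3f} mi / {d.km:.3f} km / {d.nautical:.3f} NM")
    # Expect roughly 703.884 mi / 1132.792 km / 611.659 NM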

Haversine formula
  • 703.093 miles
  • 1131.519 kilometers
  • 610.972 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
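The haversine formula is short enough to implement directly. A minimal sketch, assuming the conventional mean earth radius of 6,371 km; run with the airport coordinates listed below, it reproduces the figures above:

    from math import asin, cos, radians, sin, sqrt

    def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
        """Great-circle distance between two lat/lon points on a sphere."""
        lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
        a = (sin((lat2 - lat1) / 2) ** 2
             + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
        return 2 * radius_km * asin(sqrt(a))

    km = haversine_km(27.440833, 109.700000, 31.494167, 120.428889)
    print(f"{km:.3f} km / {km * 0.621371:.3f} mi")
    # Expect roughly 1131.5 km / 703.1 mi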

How long does it take to fly from Huaihua to Wuxi?

The estimated flight time from Huaihua Zhijiang Airport to Sunan Shuofang International Airport is 1 hour and 49 minutes.
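The page does not state how it derives this estimate. A common heuristic, and purely an assumption behind the sketch below, is to divide the distance by a typical average speed of about 500 mph and add a fixed allowance for taxi, climb, and descent; with a 25-minute allowance, those assumed numbers happen to reproduce the 1 hour 49 minute figure.

    CRUISE_MPH = 500     # assumed average block speed
    OVERHEAD_MIN = 25    # assumed taxi/climb/descent allowance

    def estimate_flight_minutes(distance_miles: float) -> int:
        """Heuristic flight time: cruise time plus a fixed overhead."""
        return round(distance_miles / CRUISE_MPH * 60 + OVERHEAD_MIN)

    minutes = estimate_flight_minutes(704)
    print(f"{minutes // 60} h {minutes % 60} min")   # -> 1 h 49 min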

What is the time difference between Huaihua and Wuxi?

There is no time difference between Huaihua and Wuxi; both cities observe China Standard Time (UTC+8).

Flight carbon footprint between Huaihua Zhijiang Airport (HJJ) and Sunan Shuofang International Airport (WUX)

On average, flying from Huaihua to Wuxi generates about 125 kg (276 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.

Map of flight path and driving directions from Huaihua to Wuxi

See the map of the shortest flight path between Huaihua Zhijiang Airport (HJJ) and Sunan Shuofang International Airport (WUX).

Airport information

Origin Huaihua Zhijiang Airport
City: Huaihua
Country: China
IATA Code: HJJ
ICAO Code: ZGCJ
Coordinates: 27°26′27″N, 109°42′0″E
Destination Sunan Shuofang International Airport
City: Wuxi
Country: China
IATA Code: WUX
ICAO Code: ZSWX
Coordinates: 31°29′39″N, 120°25′44″E
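Both distance formulas take decimal degrees, while the coordinates above are listed in degrees, minutes, and seconds. A minimal conversion sketch:

    def dms_to_decimal(degrees: int, minutes: int, seconds: float, hemisphere: str) -> float:
        """Convert degrees/minutes/seconds + hemisphere letter to signed decimal degrees."""
        value = degrees + minutes / 60 + seconds / 3600
        return -value if hemisphere in ("S", "W") else value

    print(dms_to_decimal(27, 26, 27, "N"))    # 27.440833... (HJJ latitude)
    print(dms_to_decimal(120, 25, 44, "E"))   # 120.428888... (WUX longitude)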