
How far is Wuxi from Niigata?

The distance between Niigata (Niigata Airport) and Wuxi (Sunan Shuofang International Airport) is 1151 miles / 1852 kilometers / 1000 nautical miles.

The driving distance from Niigata (KIJ) to Wuxi (WUX) is 2482 miles / 3994 kilometers, and travel time by car is about 49 hours 52 minutes.

Niigata Airport – Sunan Shuofang International Airport: 1151 miles / 1852 kilometers / 1000 nautical miles


Distance from Niigata to Wuxi

There are several ways to calculate the distance from Niigata to Wuxi. Here are two standard methods:

Vincenty's formula (applied above)
  • 1150.599 miles
  • 1851.709 kilometers
  • 999.843 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
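Vincenty's inverse method can be sketched directly in Python. The version below is a minimal illustration, not the exact code used by this calculator: it assumes the WGS-84 ellipsoid constants and the airport coordinates listed at the bottom of this page (converted to decimal degrees).

    import math

    def vincenty_inverse(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
        """Distance in metres between two points on the WGS-84 ellipsoid."""
        a = 6378137.0               # semi-major axis (m)
        f = 1 / 298.257223563       # flattening
        b = (1 - f) * a             # semi-minor axis (m)

        U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))   # reduced latitudes
        U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
        L = math.radians(lon2 - lon1)
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)

        lam = L
        for _ in range(max_iter):
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.hypot(cosU2 * sin_lam,
                                   cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
            if sin_sigma == 0:
                return 0.0                                        # coincident points
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos2_alpha = 1 - sin_alpha ** 2
            cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
            C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
            lam_prev = lam
            lam = L + (1 - C) * f * sin_alpha * (
                sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
            if abs(lam - lam_prev) < tol:
                break

        u2 = cos2_alpha * (a * a - b * b) / (b * b)
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sm ** 2) -
            B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
        return b * A * (sigma - d_sigma)

    # KIJ 37°57′21″N 139°7′15″E  ->  WUX 31°29′39″N 120°25′44″E (decimal degrees)
    metres = vincenty_inverse(37.9558, 139.1208, 31.4942, 120.4289)
    print(f"{metres / 1609.344:.3f} mi / {metres / 1000:.3f} km / {metres / 1852:.3f} nmi")
    # Expect values close to the 1150.599 mi / 1851.709 km quoted above.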

Haversine formula
  • 1148.849 miles
  • 1848.893 kilometers
  • 998.322 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance, the shortest path between the two points along the sphere's surface).
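The spherical calculation is much shorter. The sketch below assumes the commonly used mean Earth radius of 6371 km, so its output should land close to the haversine figures quoted above.

    import math

    def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
        # Great-circle distance on a sphere of the given mean radius.
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        h = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
        return 2 * radius_km * math.asin(math.sqrt(h))

    km = haversine_km(37.9558, 139.1208, 31.4942, 120.4289)   # KIJ -> WUX
    print(f"{km / 1.609344:.3f} mi / {km:.3f} km / {km / 1.852:.3f} nmi")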

How long does it take to fly from Niigata to Wuxi?

The estimated flight time from Niigata Airport to Sunan Shuofang International Airport is 2 hours and 40 minutes.
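The calculator's exact timing model is not published here. A minimal sketch of the usual approach, a fixed allowance for taxi, climb and descent plus cruise time at an assumed average speed, is shown below; both constants are illustrative guesses, which is why the result lands near, but not exactly on, the 2 hours 40 minutes quoted above.

    def estimate_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
        # Illustrative model: fixed taxi/climb/descent allowance plus cruise
        # at an assumed average ground speed (both constants are assumptions).
        total_min = overhead_min + distance_miles / cruise_mph * 60
        hours, minutes = divmod(round(total_min), 60)
        return f"{hours} h {minutes:02d} min"

    print(estimate_flight_time(1150.599))   # about 2 h 48 min with these assumed constants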

Flight carbon footprint between Niigata Airport (KIJ) and Sunan Shuofang International Airport (WUX)

On average, flying from Niigata to Wuxi generates about 159 kg of CO2 per passenger (roughly 351 pounds). These figures are estimates and include only the CO2 generated by burning jet fuel.
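As a rough illustration, the per-passenger figure can be reproduced from a per-mile emission factor. The factor below is simply back-solved from the numbers quoted above (about 159 kg over roughly 1151 miles); real emission factors vary with aircraft type, load factor and flight length.

    KG_PER_LB = 0.45359237

    def co2_per_passenger_kg(distance_miles, kg_per_mile=0.138):
        # 0.138 kg CO2 per passenger-mile is back-solved from this page's
        # own figures, not an official emission factor.
        return distance_miles * kg_per_mile

    kg = co2_per_passenger_kg(1150.599)
    print(f"{kg:.0f} kg CO2 (~{kg / KG_PER_LB:.0f} lb)")   # about 159 kg / 350 lb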

Map of flight path and driving directions from Niigata to Wuxi

See the map of the shortest flight path between Niigata Airport (KIJ) and Sunan Shuofang International Airport (WUX).

Airport information

Origin: Niigata Airport
City: Niigata
Country: Japan
IATA Code: KIJ
ICAO Code: RJSN
Coordinates: 37°57′21″N, 139°7′15″E
Destination: Sunan Shuofang International Airport
City: Wuxi
Country: China
IATA Code: WUX
ICAO Code: ZSWX
Coordinates: 31°29′39″N, 120°25′44″E
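Both distance formulas above take decimal degrees, while the coordinates here are listed in degrees, minutes and seconds. A small conversion helper (an illustrative sketch, not part of this site) bridges the two:

    def dms_to_decimal(degrees, minutes, seconds, hemisphere):
        # North and East are positive; South and West are negative.
        sign = -1 if hemisphere in ("S", "W") else 1
        return sign * (degrees + minutes / 60 + seconds / 3600)

    # Niigata Airport (KIJ): 37°57′21″N, 139°7′15″E
    kij = (dms_to_decimal(37, 57, 21, "N"), dms_to_decimal(139, 7, 15, "E"))
    # Sunan Shuofang International Airport (WUX): 31°29′39″N, 120°25′44″E
    wux = (dms_to_decimal(31, 29, 39, "N"), dms_to_decimal(120, 25, 44, "E"))
    print(kij, wux)   # roughly (37.9558, 139.1208) and (31.4942, 120.4289)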