How far is Wuxi from Chicago, IL?

The distance between Chicago (Chicago O'Hare International Airport) and Wuxi (Sunan Shuofang International Airport) is 7064 miles / 11369 kilometers / 6139 nautical miles.

Distance from Chicago to Wuxi

There are several ways to calculate the distance from Chicago to Wuxi. Here are two standard methods:

Vincenty's formula (applied above)
  • 7064.489 miles
  • 11369.193 kilometers
  • 6138.873 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
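The page does not say which implementation or ellipsoid it uses; below is a minimal Python sketch of the standard Vincenty inverse method on the WGS-84 ellipsoid (an assumption), with ORD and WUX coordinates converted to decimal degrees from the airport information at the bottom of the page:

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Inverse Vincenty distance on the WGS-84 ellipsoid, in miles."""
    a = 6378137.0              # WGS-84 semi-major axis (meters)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a            # semi-minor axis

    # Reduced latitudes and longitude difference
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                   if cos2_alpha else 0.0)  # equatorial line
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:  # converged (fails near antipodes)
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - d_sigma) / 1609.344  # meters -> miles

# ORD: 41°58′42″N, 87°54′17″W   WUX: 31°29′39″N, 120°25′44″E
print(f"{vincenty_miles(41.9783, -87.9047, 31.4942, 120.4289):.1f} miles")
# ≈ 7064 miles
```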

Haversine formula
  • 7050.015 miles
  • 11345.900 kilometers
  • 6126.296 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
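For comparison, a minimal haversine sketch on a sphere with the mean Earth radius (6371 km ≈ 3958.8 miles, an assumed value; the page does not state the radius it uses):

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2, radius_mi=3958.8):
    """Great-circle distance on a sphere of the given radius, in miles."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_mi * math.asin(math.sqrt(a))

print(f"{haversine_miles(41.9783, -87.9047, 31.4942, 120.4289):.0f} miles")
# ≈ 7050 miles
```

The ellipsoidal (Vincenty) and spherical (haversine) results differ by about 14 miles here, roughly 0.2% of the total distance.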

How long does it take to fly from Chicago to Wuxi?

The estimated flight time from Chicago O'Hare International Airport to Sunan Shuofang International Airport is 13 hours and 52 minutes.
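The page does not document how this estimate is derived. A common approach is to divide the great-circle distance by an assumed average block speed; the sketch below uses a hypothetical 510 mph, which comes close to the quoted figure but is not the site's published formula:

```python
def flight_time(distance_mi, avg_speed_mph=510.0):
    """Rough flight time as (hours, minutes); the speed is an assumption."""
    hours = distance_mi / avg_speed_mph
    h, m = int(hours), round((hours - int(hours)) * 60)
    return h + m // 60, m % 60

# ≈ 13 hours and 51 minutes with the assumed 510 mph average speed
print("%d hours and %d minutes" % flight_time(7064.489))
```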

Flight carbon footprint between Chicago O'Hare International Airport (ORD) and Sunan Shuofang International Airport (WUX)

On average, flying from Chicago to Wuxi generates about 864 kg of CO2 per passenger; 864 kilograms is equal to 1,905 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
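The unit conversion (1 kg = 2.20462 lb) and the per-mile rate implied by the page's own figures can be checked directly:

```python
distance_mi = 7064.489  # Vincenty distance from above
co2_kg = 864            # per-passenger estimate from above

print(f"{co2_kg * 2.20462:.0f} lbs")                            # 1905 lbs
print(f"{co2_kg / distance_mi:.3f} kg CO2 per passenger-mile")  # 0.122
```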

Map of flight path from Chicago to Wuxi

See the map of the shortest flight path between Chicago O'Hare International Airport (ORD) and Sunan Shuofang International Airport (WUX).

Airport information

Origin: Chicago O'Hare International Airport
City: Chicago, IL
Country: United States
IATA Code: ORD
ICAO Code: KORD
Coordinates: 41°58′42″N, 87°54′17″W

Destination: Sunan Shuofang International Airport
City: Wuxi
Country: China
IATA Code: WUX
ICAO Code: ZSWX
Coordinates: 31°29′39″N, 120°25′44″E