
How far is Wuxi from Jeju?

The distance between Jeju (Jeju International Airport) and Wuxi (Sunan Shuofang International Airport) is 380 miles / 612 kilometers / 330 nautical miles.

The driving distance from Jeju (CJU) to Wuxi (WUX) is 1723 miles / 2773 kilometers, and travel time by car is about 32 hours 22 minutes.

Jeju International Airport – Sunan Shuofang International Airport

380 miles / 612 kilometers / 330 nautical miles


Distance from Jeju to Wuxi

There are several ways to calculate the distance from Jeju to Wuxi. Here are two standard methods:

Vincenty's formula (applied above)
  • 380.308 miles
  • 612.046 kilometers
  • 330.479 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
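
Vincenty's iterative formula itself isn't reproduced here. As a rough cross-check, and assuming the third-party geopy library is available, its geodesic distance (Karney's algorithm on the WGS-84 ellipsoid, a modern successor to Vincenty's method) should agree with the figures above to well under a metre:

    # Ellipsoidal distance between CJU and WUX on the WGS-84 ellipsoid.
    # Assumes geopy is installed (pip install geopy). geopy uses Karney's
    # algorithm rather than Vincenty's original iteration, but both model
    # the Earth as an ellipsoid and agree here essentially exactly.
    from geopy.distance import geodesic

    cju = (33.5111, 126.4928)  # Jeju International Airport (lat, lon)
    wux = (31.4942, 120.4289)  # Sunan Shuofang International Airport (lat, lon)

    d = geodesic(cju, wux)
    print(f"{d.miles:.3f} mi / {d.km:.3f} km / {d.nautical:.3f} NM")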

Haversine formula
  • 379.759 miles
  • 611.163 kilometers
  • 330.002 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
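
The haversine calculation is simple enough to sketch directly. A minimal Python version, assuming a mean Earth radius of 6371 km and using the airport coordinates listed further down, converted to decimal degrees:

    from math import radians, sin, cos, asin, sqrt

    def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
        """Great-circle distance between two lat/lon points, in kilometers."""
        phi1, phi2 = radians(lat1), radians(lat2)
        dphi = radians(lat2 - lat1)
        dlam = radians(lon2 - lon1)
        a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
        return 2 * radius_km * asin(sqrt(a))

    # CJU -> WUX using the airport coordinates listed below (decimal degrees)
    d_km = haversine_km(33.5111, 126.4928, 31.4942, 120.4289)
    print(f"{d_km * 0.621371:.1f} mi / {d_km:.1f} km / {d_km / 1.852:.1f} NM")
    # ≈ 380 mi / 611 km / 330 NM, in line with the haversine figures above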

How long does it take to fly from Jeju to Wuxi?

The estimated flight time from Jeju International Airport to Sunan Shuofang International Airport is 1 hour and 13 minutes.
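
The page doesn't say how this estimate is derived. A common rule of thumb is cruise time at a typical short-haul speed plus a fixed allowance for taxi, climb, and descent; the sketch below uses illustrative values, not the calculator's actual parameters, so it lands near but not exactly on the figure above:

    # Illustrative only: a simple distance/speed estimate. The 500 mph average
    # speed and 30-minute taxi/climb/descent allowance are assumptions.
    ASSUMED_AVG_MPH = 500
    ASSUMED_OVERHEAD_MIN = 30

    def estimate_flight_minutes(distance_miles):
        return ASSUMED_OVERHEAD_MIN + distance_miles / ASSUMED_AVG_MPH * 60

    minutes = estimate_flight_minutes(380.308)
    print(f"~{int(minutes // 60)} h {round(minutes % 60)} min")  # ~1 h 16 min with these assumptions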

Flight carbon footprint between Jeju International Airport (CJU) and Sunan Shuofang International Airport (WUX)

On average, flying from Jeju to Wuxi generates about 81 kg (179 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.

Map of flight path and driving directions from Jeju to Wuxi

See the map of the shortest flight path between Jeju International Airport (CJU) and Sunan Shuofang International Airport (WUX).

Airport information

Origin Jeju International Airport
City: Jeju
Country: South Korea
IATA Code: CJU
ICAO Code: RKPC
Coordinates: 33°30′40″N, 126°29′34″E
Destination Sunan Shuofang International Airport
City: Wuxi
Country: China
IATA Code: WUX
ICAO Code: ZSWX
Coordinates: 31°29′39″N, 120°25′44″E
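
The coordinates above are given in degrees, minutes, and seconds, while the distance examples earlier use decimal degrees. A minimal conversion helper, assuming north and east are positive:

    def dms_to_decimal(degrees, minutes, seconds, hemisphere):
        """Convert degrees/minutes/seconds to decimal degrees (N/E positive)."""
        sign = -1 if hemisphere in ("S", "W") else 1
        return sign * (degrees + minutes / 60 + seconds / 3600)

    # Jeju International Airport (CJU): 33°30′40″N, 126°29′34″E
    print(dms_to_decimal(33, 30, 40, "N"), dms_to_decimal(126, 29, 34, "E"))  # ≈ 33.5111 126.4928
    # Sunan Shuofang International Airport (WUX): 31°29′39″N, 120°25′44″E
    print(dms_to_decimal(31, 29, 39, "N"), dms_to_decimal(120, 25, 44, "E"))  # ≈ 31.4942 120.4289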