
How far is Wuxi from Tacheng?

The distance between Tacheng (Tacheng Airport) and Wuxi (Sunan Shuofang International Airport) is 2224 miles / 3579 kilometers / 1933 nautical miles.

The driving distance from Tacheng (TCG) to Wuxi (WUX) is 2653 miles / 4270 kilometers, and travel time by car is about 48 hours 5 minutes.

Tacheng Airport – Sunan Shuofang International Airport

2224 miles / 3579 kilometers / 1933 nautical miles


Distance from Tacheng to Wuxi

There are several ways to calculate the distance from Tacheng to Wuxi. Here are two standard methods:

Vincenty's formula (applied above)
  • 2224.037 miles
  • 3579.240 kilometers
  • 1932.635 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
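As an illustration (a sketch of the standard Vincenty inverse iteration on the WGS-84 ellipsoid, not the calculator's actual code), the computation fits in a few dozen lines of Python. The decimal coordinates are converted from the airport table at the end of this page:

```python
import math

def vincenty_m(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula: geodesic distance in meters on the WGS-84 ellipsoid."""
    a = 6378137.0                 # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563         # WGS-84 flattening
    b = (1 - f) * a               # semi-minor axis
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):          # iterate until the longitude difference converges
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.hypot(cosU2 * sinLam, cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = cosSigma - 2 * sinU1 * sinU2 / cos2Alpha if cos2Alpha else 0.0
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (cos2SigmaM + C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < 1e-12:
            break

    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    deltaSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2)
        - B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2) * (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - deltaSigma)

# TCG: 46°40′21″N, 83°20′26″E — WUX: 31°29′39″N, 120°25′44″E
d_km = vincenty_m(46.6725, 83.340556, 31.494167, 120.428889) / 1000
print(f"{d_km:.2f} km")  # close to the 3579.24 km quoted above
```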

Haversine formula
  • 2220.609 miles
  • 3573.724 kilometers
  • 1929.657 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
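A minimal haversine sketch in Python, assuming a mean Earth radius of 6,371 km and the airport coordinates from the table below (converted to decimal degrees):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two points on a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

# TCG: 46°40′21″N, 83°20′26″E — WUX: 31°29′39″N, 120°25′44″E
tcg = (46.6725, 83.340556)
wux = (31.494167, 120.428889)
print(f"{haversine_km(*tcg, *wux):.1f} km")  # ≈ 3573.7 km, matching the figure above
```

The small difference from the Vincenty result (about 5.5 km here) comes from the spherical-Earth assumption; the ellipsoidal model is slightly more accurate.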

How long does it take to fly from Tacheng to Wuxi?

The estimated flight time from Tacheng Airport to Sunan Shuofang International Airport is 4 hours and 42 minutes.
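The site does not publish its estimator, but a common back-of-envelope approach (an assumption here, not the calculator's actual formula) adds a fixed allowance for taxi, climb, and descent to the great-circle distance flown at a typical jet cruise speed:

```python
def estimate_flight_time(distance_miles, cruise_mph=500.0, overhead_min=30.0):
    """Rough block-time estimate: cruise time plus fixed taxi/climb/descent overhead."""
    total_min = distance_miles / cruise_mph * 60.0 + overhead_min
    hours, minutes = divmod(round(total_min), 60)
    return hours, minutes

print(estimate_flight_time(2224))  # (4, 57)
```

With these assumed parameters the estimate comes out to 4 hours 57 minutes, in the same ballpark as the 4 hours 42 minutes quoted above; the exact figure depends on the cruise speed and overhead assumed.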

Flight carbon footprint between Tacheng Airport (TCG) and Sunan Shuofang International Airport (WUX)

On average, flying from Tacheng to Wuxi generates about 243 kg (536 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
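The kilogram-to-pound conversion in the figure above can be checked directly:

```python
KG_PER_LB = 0.45359237  # exact definition of the avoirdupois pound

co2_kg = 243            # per-passenger estimate quoted above
co2_lb = co2_kg / KG_PER_LB
print(round(co2_lb))    # 536
```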

Map of flight path and driving directions from Tacheng to Wuxi

See the map of the shortest flight path between Tacheng Airport (TCG) and Sunan Shuofang International Airport (WUX).

Airport information

Origin Tacheng Airport
City: Tacheng
Country: China
IATA Code: TCG
ICAO Code: ZWTC
Coordinates: 46°40′21″N, 83°20′26″E
Destination Sunan Shuofang International Airport
City: Wuxi
Country: China
IATA Code: WUX
ICAO Code: ZSWX
Coordinates: 31°29′39″N, 120°25′44″E