
How far is Shanghai from Wanxian?

The distance between Wanxian (Wanzhou Wuqiao Airport) and Shanghai (Shanghai Hongqiao International Airport) is 767 miles / 1235 kilometers / 667 nautical miles.

The driving distance from Wanxian (WXN) to Shanghai (SHA) is 908 miles / 1461 kilometers, and travel time by car is about 16 hours 20 minutes.

Distance from Wanxian to Shanghai

There are several ways to calculate the distance from Wanxian to Shanghai. Here are two standard methods:

Vincenty's formula (applied above)
  • 767.145 miles
  • 1234.600 kilometers
  • 666.631 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
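As an illustrative cross-check (not the calculator's own code), the geopy library's geodesic distance, which uses Karney's algorithm on the WGS-84 ellipsoid and agrees with Vincenty's ellipsoidal result to well under a metre at this range, reproduces the figures above from the airport coordinates listed at the bottom of this page:

```python
from geopy.distance import geodesic

# Airport coordinates from the "Airport information" section, in decimal degrees.
wxn = (30.835833, 108.405833)  # Wanzhou Wuqiao Airport, 30°50′9″N 108°24′21″E
sha = (31.197778, 121.335833)  # Shanghai Hongqiao International Airport, 31°11′52″N 121°20′9″E

d = geodesic(wxn, sha)  # ellipsoidal (WGS-84) distance
print(f"{d.miles:.3f} mi / {d.km:.3f} km / {d.nautical:.3f} NM")
# Expect values close to 767.145 mi / 1234.600 km / 666.631 NM.
```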

Haversine formula
  • 765.610 miles
  • 1232.131 kilometers
  • 665.297 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
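The haversine formula is compact enough to write out directly. A minimal Python sketch, assuming a mean Earth radius of 6371 km (the exact output depends on which radius is used):

```python
from math import asin, cos, radians, sin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points on a sphere."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * radius_km * asin(sqrt(a))

d_km = haversine_km(30.835833, 108.405833, 31.197778, 121.335833)
print(f"{d_km / 1.609344:.3f} mi / {d_km:.3f} km / {d_km / 1.852:.3f} NM")
# Expect values close to 765.610 mi / 1232.131 km / 665.297 NM.
```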

How long does it take to fly from Wanxian to Shanghai?

The estimated flight time from Wanzhou Wuqiao Airport to Shanghai Hongqiao International Airport is 1 hour and 57 minutes.
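The page does not say how this estimate is computed. A common rule of thumb, shown below with assumed parameters, adds a fixed allowance for taxi, climb, and descent to cruise time at a typical jet speed; it lands near two hours, in the same ballpark as the 1 hour 57 minutes quoted above but not an exact match.

```python
def estimated_flight_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    # Assumed rule of thumb: fixed taxi/climb/descent allowance plus cruise time.
    return overhead_min + distance_miles / cruise_mph * 60

minutes = estimated_flight_minutes(767.145)
print(f"{minutes // 60:.0f} h {minutes % 60:.0f} min")  # ~2 h 2 min with these assumptions
```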

What is the time difference between Wanxian and Shanghai?

There is no time difference between Wanxian and Shanghai; both cities observe China Standard Time (UTC+8).

Flight carbon footprint between Wanzhou Wuqiao Airport (WXN) and Shanghai Hongqiao International Airport (SHA)

On average, flying from Wanxian to Shanghai generates about 132 kg of CO2 per passenger, equivalent to roughly 291 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
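The site does not publish its emission model. One simple approach, multiplying distance by a per-passenger emission factor, reproduces the figure quoted above; the factor below is back-calculated from this page's own numbers (132 kg over 1234.6 km) and is not a documented constant.

```python
def co2_per_passenger_kg(distance_km, kg_per_pax_km=0.107):
    # Assumed short-haul factor, back-calculated from 132 kg / 1234.6 km.
    # Real calculators vary this by aircraft type, seating class, and load factor.
    return distance_km * kg_per_pax_km

kg = co2_per_passenger_kg(1234.6)
print(f"{kg:.0f} kg CO2 ≈ {kg * 2.20462:.0f} lb")  # ~132 kg ≈ ~291 lb
```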

Map of flight path and driving directions from Wanxian to Shanghai

See the map of the shortest flight path between Wanzhou Wuqiao Airport (WXN) and Shanghai Hongqiao International Airport (SHA).

Airport information

Origin Wanzhou Wuqiao Airport
City: Wanxian
Country: China
IATA Code: WXN
ICAO Code: ZUWX
Coordinates: 30°50′9″N, 108°24′21″E
Destination Shanghai Hongqiao International Airport
City: Shanghai
Country: China
IATA Code: SHA
ICAO Code: ZSSS
Coordinates: 31°11′52″N, 121°20′9″E
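The coordinates above are listed in degrees, minutes, and seconds, while the distance formulas earlier on this page take decimal degrees. A small conversion helper (illustrative only):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    # Southern and western hemispheres are negative by convention.
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

print(dms_to_decimal(30, 50, 9, "N"), dms_to_decimal(108, 24, 21, "E"))  # WXN: 30.8358, 108.4058
print(dms_to_decimal(31, 11, 52, "N"), dms_to_decimal(121, 20, 9, "E"))  # SHA: 31.1978, 121.3358
```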