
How far is Weifang from Shanghai?

The distance between Shanghai (Shanghai Hongqiao International Airport) and Weifang (Weifang Nanyuan Airport) is 397 miles / 638 kilometers / 345 nautical miles.

The driving distance from Shanghai (SHA) to Weifang (WEF) is 449 miles / 722 kilometers, and travel time by car is about 8 hours 16 minutes.

Shanghai Hongqiao International Airport – Weifang Nanyuan Airport

397 miles / 638 kilometers / 345 nautical miles


Distance from Shanghai to Weifang

There are several ways to calculate the distance from Shanghai to Weifang. Here are two standard methods:

Vincenty's formula (applied above)
  • 396.528 miles
  • 638.150 kilometers
  • 344.573 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
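As an illustration only, here is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid, evaluated at the airport coordinates listed in the airport information section below. The function name, convergence tolerance, and iteration cap are assumptions; the calculator's own implementation is not published.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Inverse Vincenty: geodesic distance in metres on the WGS-84 ellipsoid."""
    a = 6378137.0             # semi-major axis (m)
    f = 1 / 298.257223563     # flattening
    b = (1 - f) * a           # semi-minor axis (m)

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    lam = L
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0        # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m
                                     + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
        - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# SHA (31°11′52″N, 121°20′9″E) to WEF (36°38′48″N, 119°7′8″E)
metres = vincenty_distance(31.19778, 121.33583, 36.64667, 119.11889)
print(metres / 1000, metres / 1609.344)   # ≈ 638 km / ≈ 396.5 miles
```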

Haversine formula
  • 397.318 miles
  • 639.421 kilometers
  • 345.260 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
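For comparison, here is a short haversine sketch, assuming the commonly used 6,371 km mean Earth radius (the radius used by the calculator is not stated).

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in kilometres on a sphere of the given radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# Same SHA → WEF coordinates as above
km = haversine_distance(31.19778, 121.33583, 36.64667, 119.11889)
print(km, km / 1.609344)   # ≈ 639.4 km / ≈ 397.3 miles
```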

How long does it take to fly from Shanghai to Weifang?

The estimated flight time from Shanghai Hongqiao International Airport to Weifang Nanyuan Airport is 1 hour and 15 minutes.

What is the time difference between Shanghai and Weifang?

There is no time difference between Shanghai and Weifang.

Flight carbon footprint between Shanghai Hongqiao International Airport (SHA) and Weifang Nanyuan Airport (WEF)

On average, flying from Shanghai to Weifang generates about 83 kg of CO2 per passenger (roughly 183 pounds). The figures are estimates and include only the CO2 generated by burning jet fuel.

Map of flight path and driving directions from Shanghai to Weifang

See the map of the shortest flight path between Shanghai Hongqiao International Airport (SHA) and Weifang Nanyuan Airport (WEF).

Airport information

Origin: Shanghai Hongqiao International Airport
City: Shanghai
Country: China
IATA Code: SHA
ICAO Code: ZSSS
Coordinates: 31°11′52″N, 121°20′9″E
Destination: Weifang Nanyuan Airport
City: Weifang
Country: China
IATA Code: WEF
ICAO Code: ZSWF
Coordinates: 36°38′48″N, 119°7′8″E