
How far is Shanghai from Vancouver?

The distance between Vancouver (Vancouver International Airport) and Shanghai (Shanghai Hongqiao International Airport) is 5628 miles / 9058 kilometers / 4891 nautical miles.

Vancouver International Airport – Shanghai Hongqiao International Airport


Distance from Vancouver to Shanghai

There are several ways to calculate the distance from Vancouver to Shanghai. Here are two standard methods:

Vincenty's formula (applied above)
  • 5628.465 miles
  • 9058.137 kilometers
  • 4891.003 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
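The iterative inverse method can be sketched as follows. This is a minimal implementation of Vincenty's inverse formula on the WGS-84 ellipsoid; the calculator's exact code and rounding are not shown, so treat the coordinates (converted from the airport listings below) and tolerances as illustrative.

```python
import math

# WGS-84 ellipsoid parameters
A_AXIS = 6378137.0             # semi-major axis (m)
F = 1 / 298.257223563          # flattening
B_AXIS = (1 - F) * A_AXIS      # semi-minor axis (m)

def vincenty_m(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Geodesic distance in metres between two (lat, lon) points in degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    # Normalise the longitude difference to (-180, 180] before converting
    L = math.radians((lon2 - lon1 + 540) % 360 - 180)
    U1 = math.atan((1 - F) * math.tan(phi1))   # reduced latitudes
    U2 = math.atan((1 - F) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = F / 16 * cos2_alpha * (4 + F * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * F * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (2 * cos_2sigma_m ** 2 - 1)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (A_AXIS ** 2 - B_AXIS ** 2) / B_AXIS ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
                                   * (-3 + 4 * cos_2sigma_m ** 2)))
    return B_AXIS * A * (sigma - delta_sigma)

# YVR and SHA in decimal degrees (from the airport coordinates below)
km = vincenty_m(49.1939, -123.1839, 31.1978, 121.3358) / 1000
```

With these inputs the result lands on the ~9058 km figure quoted above; tiny differences come from rounding the coordinates to four decimal places.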

Haversine formula
  • 5615.908 miles
  • 9037.928 kilometers
  • 4880.091 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
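The spherical calculation is much shorter. A minimal sketch, assuming a mean Earth radius of 6371 km (the radius the site uses is not stated, but this standard value reproduces its figures closely):

```python
import math

R_KM = 6371.0  # mean Earth radius (assumed)

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two (lat, lon) points in degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * R_KM * math.asin(math.sqrt(a))

# YVR and SHA in decimal degrees (from the airport coordinates below)
km = haversine_km(49.1939, -123.1839, 31.1978, 121.3358)
```

The result comes out near the ~9038 km figure above, about 20 km shorter than the ellipsoidal Vincenty distance, which is typical for routes at these latitudes.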

How long does it take to fly from Vancouver to Shanghai?

The estimated flight time from Vancouver International Airport to Shanghai Hongqiao International Airport is 11 hours and 9 minutes.
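The page does not state how it derives this estimate. One common approach is distance divided by an assumed average block speed; an average speed of about 505 mph (an assumption, not given by the source) reproduces the quoted figure:

```python
distance_mi = 5628.465     # Vincenty distance from above
avg_speed_mph = 505        # assumed average block speed (not stated by the source)

hours = distance_mi / avg_speed_mph
h = int(hours)
m = round((hours - h) * 60)
# ≈ 11 h 9 min, matching the estimate above
```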

Flight carbon footprint between Vancouver International Airport (YVR) and Shanghai Hongqiao International Airport (SHA)

On average, flying from Vancouver to Shanghai generates about 667 kg (roughly 1,470 lb) of CO2 per passenger. These figures are estimates and include only the CO2 produced by burning jet fuel.
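The pound figure is just a unit conversion of the kilogram estimate:

```python
co2_kg = 667
co2_lb = co2_kg * 2.20462   # 1 kg ≈ 2.20462 lb
# rounds to about 1,470 lb
```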

Map of flight path from Vancouver to Shanghai

See the map of the shortest flight path between Vancouver International Airport (YVR) and Shanghai Hongqiao International Airport (SHA).

Airport information

Origin Vancouver International Airport
City: Vancouver
Country: Canada
IATA Code: YVR
ICAO Code: CYVR
Coordinates: 49°11′38″N, 123°11′2″W
Destination Shanghai Hongqiao International Airport
City: Shanghai
Country: China
IATA Code: SHA
ICAO Code: ZSSS
Coordinates: 31°11′52″N, 121°20′9″E
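The coordinates above are in degrees–minutes–seconds; the distance formulas need signed decimal degrees. A small conversion sketch (the helper name is illustrative):

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere letter to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

yvr_lat = dms_to_decimal(49, 11, 38, "N")    # ≈ 49.1939
yvr_lon = dms_to_decimal(123, 11, 2, "W")    # ≈ -123.1839
sha_lat = dms_to_decimal(31, 11, 52, "N")    # ≈ 31.1978
sha_lon = dms_to_decimal(121, 20, 9, "E")    # ≈ 121.3358
```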