How far is Yichun from Shanghai?

The distance between Shanghai (Shanghai Hongqiao International Airport) and Yichun (Yichun Lindu Airport) is 1212 miles / 1951 kilometers / 1053 nautical miles.

The driving distance from Shanghai (SHA) to Yichun (LDS) is 1607 miles / 2586 kilometers, and travel time by car is about 29 hours 25 minutes.

Shanghai Hongqiao International Airport – Yichun Lindu Airport

1212 miles / 1951 kilometers / 1053 nautical miles

Distance from Shanghai to Yichun

There are several ways to calculate the distance from Shanghai to Yichun. Here are two standard methods:

Vincenty's formula (applied above)
  • 1212.155 miles
  • 1950.775 kilometers
  • 1053.334 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
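
As a concrete illustration, below is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid, using the airport coordinates listed at the bottom of this page. It is an independent sketch, not this site's actual implementation; the convergence tolerance and iteration cap are illustrative choices.

```python
import math

# WGS-84 ellipsoid parameters
A = 6378137.0             # semi-major axis (meters)
F = 1 / 298.257223563     # flattening
B = (1 - F) * A           # semi-minor axis (meters)

def vincenty(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Vincenty inverse formula: geodesic distance in meters on WGS-84."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)
    # Reduced latitudes on the auxiliary sphere
    U1 = math.atan((1 - F) * math.tan(phi1))
    U2 = math.atan((1 - F) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L  # first approximation of the longitude difference on the sphere
    for _ in range(max_iter):
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.hypot(cosU2 * sinLam,
                              cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        if sinSigma == 0:
            return 0.0  # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = (cosSigma - 2 * sinU1 * sinU2 / cos2Alpha
                      if cos2Alpha else 0.0)  # equatorial line: cos2Alpha = 0
        C = F / 16 * cos2Alpha * (4 + F * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * F * sinAlpha * (
            sigma + C * sinSigma * (
                cos2SigmaM + C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < tol:
            break

    u2 = cos2Alpha * (A ** 2 - B ** 2) / B ** 2
    bigA = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    bigB = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    deltaSigma = bigB * sinSigma * (
        cos2SigmaM + bigB / 4 * (
            cosSigma * (-1 + 2 * cos2SigmaM ** 2)
            - bigB / 6 * cos2SigmaM
              * (-3 + 4 * sinSigma ** 2) * (-3 + 4 * cos2SigmaM ** 2)))
    return B * bigA * (sigma - deltaSigma)

# SHA -> LDS, coordinates from the airport information section below
meters = vincenty(31.19778, 121.33583, 47.75194, 129.01889)
print(f"{meters / 1609.344:.3f} miles")   # ~1212.2 miles
print(f"{meters / 1000:.3f} kilometers")  # ~1950.8 kilometers
```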

Haversine formula
  • 1213.459 miles
  • 1952.874 kilometers
  • 1054.467 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth, giving the great-circle distance: the shortest path between two points along the sphere's surface.
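
For comparison, here is a minimal haversine sketch in the same style, using the same coordinates. The 6371 km mean earth radius is a conventional but assumed choice; small changes to it shift the result slightly.

```python
import math

EARTH_RADIUS_KM = 6371.0  # mean earth radius; an assumed, conventional value

def haversine(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers, assuming a spherical earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * EARTH_RADIUS_KM * math.atan2(math.sqrt(a), math.sqrt(1 - a))

# SHA -> LDS, same coordinates as above
km = haversine(31.19778, 121.33583, 47.75194, 129.01889)
print(f"{km:.3f} kilometers")        # ~1952.9 kilometers
print(f"{km / 1.609344:.3f} miles")  # ~1213.5 miles
```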

How long does it take to fly from Shanghai to Yichun?

The estimated flight time from Shanghai Hongqiao International Airport to Yichun Lindu Airport is 2 hours and 47 minutes.
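
The page does not state how this estimate is derived. A common rule of thumb, used here purely as an assumption, is cruise time at a typical jet speed plus a fixed allowance for taxi, takeoff, and landing; with the illustrative values below it lands in the same ballpark as the figure above.

```python
def estimated_flight_time(distance_miles, cruise_mph=500, buffer_min=30):
    """Rough block-time estimate: cruise time plus a fixed taxi/climb/descent
    allowance. Both parameters are illustrative assumptions, not this site's
    actual model."""
    total_min = buffer_min + distance_miles / cruise_mph * 60
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} h {minutes} min"

print(estimated_flight_time(1212))  # ~2 h 55 min under these assumptions
```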

What is the time difference between Shanghai and Yichun?

There is no time difference between Shanghai and Yichun: despite spanning several geographic time zones, all of mainland China observes China Standard Time (UTC+8).
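
A quick check with Python's standard zoneinfo module illustrates this; Yichun, in Heilongjiang, falls under the same IANA zone as Shanghai.

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # Python 3.9+

# All of mainland China, Heilongjiang included, uses one zone: Asia/Shanghai
now = datetime.now(ZoneInfo("Asia/Shanghai"))
print(now.utcoffset())  # 8:00:00 (UTC+8, China Standard Time)
```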

Flight carbon footprint between Shanghai Hongqiao International Airport (SHA) and Yichun Lindu Airport (LDS)

On average, flying from Shanghai to Yichun generates about 162 kg (357 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
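
As an illustration only, the flat per-mile factor below is back-derived from this route's own numbers (162 kg over 1212 miles, about 0.134 kg per passenger-mile); real emission models vary with aircraft type, load factor, and flight length.

```python
KG_PER_PASSENGER_MILE = 162 / 1212  # back-derived from this route's figures
KG_TO_LBS = 2.20462                 # kilograms-to-pounds conversion factor

def co2_per_passenger_kg(distance_miles):
    """Per-passenger CO2 estimate using a flat per-mile factor (a
    simplification; real models are not linear in distance)."""
    return distance_miles * KG_PER_PASSENGER_MILE

kg = co2_per_passenger_kg(1212)
print(f"{kg:.0f} kg ({kg * KG_TO_LBS:.0f} lbs)")  # 162 kg (357 lbs)
```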

Map of flight path and driving directions from Shanghai to Yichun

See the map of the shortest flight path between Shanghai Hongqiao International Airport (SHA) and Yichun Lindu Airport (LDS).

Airport information

Origin Shanghai Hongqiao International Airport
City: Shanghai
Country: China
IATA Code: SHA
ICAO Code: ZSSS
Coordinates: 31°11′52″N, 121°20′9″E
Destination Yichun Lindu Airport
City: Yichun
Country: China
IATA Code: LDS
ICAO Code: ZYLD
Coordinates: 47°45′7″N, 129°1′8″E