How far is Shanghai from Jeju?

The distance between Jeju (Jeju International Airport) and Shanghai (Shanghai Hongqiao International Airport) is 341 miles / 549 kilometers / 296 nautical miles.

The driving distance from Jeju (CJU) to Shanghai (SHA) is 1772 miles / 2852 kilometers, and travel time by car is about 33 hours 9 minutes.

Jeju International Airport – Shanghai Hongqiao International Airport

341 miles
549 kilometers
296 nautical miles

Distance from Jeju to Shanghai

There are several ways to calculate the distance from Jeju to Shanghai. Here are two standard methods:

Vincenty's formula (applied above)
  • 341.090 miles
  • 548.931 kilometers
  • 296.399 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
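
As a rough illustration (not the calculator's own code), the ellipsoidal distance can be reproduced in Python with the geographiclib package; it implements Karney's geodesic algorithm rather than Vincenty's iteration, but on the WGS-84 ellipsoid the two agree to well under a metre. Coordinates are taken from the airport information below.

    from geographiclib.geodesic import Geodesic

    # Airport coordinates in decimal degrees (from the airport information below)
    cju_lat, cju_lon = 33.5111, 126.4928   # Jeju International Airport (CJU)
    sha_lat, sha_lon = 31.1978, 121.3358   # Shanghai Hongqiao International Airport (SHA)

    # Solve the inverse geodesic problem on the WGS-84 ellipsoid;
    # 's12' is the geodesic distance in metres.
    g = Geodesic.WGS84.Inverse(cju_lat, cju_lon, sha_lat, sha_lon)
    km = g["s12"] / 1000.0
    print(f"{km:.1f} km, {km * 0.621371:.1f} mi, {km * 0.539957:.1f} nm")
    # Expected: roughly 549 km / 341 mi / 296 nm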

Haversine formula
  • 340.740 miles
  • 548.368 kilometers
  • 296.095 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
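
For comparison, here is a minimal haversine sketch in Python, assuming a mean Earth radius of 6371 km and using the same decimal coordinates as above; it reproduces the figures listed here to within about a kilometre.

    from math import radians, sin, cos, asin, sqrt

    def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
        """Great-circle distance between two points on a sphere of the given radius."""
        lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
        a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
        return 2 * radius_km * asin(sqrt(a))

    km = haversine_km(33.5111, 126.4928, 31.1978, 121.3358)  # CJU -> SHA
    print(f"{km:.1f} km, {km * 0.621371:.1f} mi, {km * 0.539957:.1f} nm")
    # Expected: roughly 548 km / 341 mi / 296 nm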

How long does it take to fly from Jeju to Shanghai?

The estimated flight time from Jeju International Airport to Shanghai Hongqiao International Airport is 1 hour and 8 minutes.

Flight carbon footprint between Jeju International Airport (CJU) and Shanghai Hongqiao International Airport (SHA)

On average, flying from Jeju to Shanghai generates about 75 kg of CO2 per passenger; 75 kilograms equals about 165 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.

Map of flight path and driving directions from Jeju to Shanghai

See the map of the shortest flight path between Jeju International Airport (CJU) and Shanghai Hongqiao International Airport (SHA).

Airport information

Origin: Jeju International Airport
City: Jeju
Country: South Korea
IATA Code: CJU
ICAO Code: RKPC
Coordinates: 33°30′40″N, 126°29′34″E
Destination: Shanghai Hongqiao International Airport
City: Shanghai
Country: China
IATA Code: SHA
ICAO Code: ZSSS
Coordinates: 31°11′52″N, 121°20′9″E