
How far is Shanghai from Saipan?

The distance between Saipan (Saipan International Airport) and Shanghai (Shanghai Hongqiao International Airport) is 1898 miles / 3055 kilometers / 1649 nautical miles.

Saipan International Airport – Shanghai Hongqiao International Airport

1898 miles / 3055 kilometers / 1649 nautical miles


Distance from Saipan to Shanghai

There are several ways to calculate the distance from Saipan to Shanghai. Here are two standard methods:

Vincenty's formula (applied above)
  • 1898.192 miles
  • 3054.843 kilometers
  • 1649.483 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
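
The page doesn't show its implementation, but Vincenty's inverse method is well documented. Below is a minimal Python sketch on the WGS-84 ellipsoid, using the SPN and SHA coordinates listed under Airport information; the function name and structure are illustrative:

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Inverse Vincenty distance on the WGS-84 ellipsoid, in statute miles."""
    a = 6378137.0              # semi-major axis (meters)
    f = 1 / 298.257223563      # flattening
    b = (1 - f) * a            # semi-minor axis

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    lam = L
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0         # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m
            + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
        - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
        * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1609.344  # meters -> miles

# SPN and SHA coordinates from the airport information below
print(vincenty_miles(15.1189, 145.7289, 31.1978, 121.3358))  # ≈ 1898.2
```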

Haversine formula
  • 1898.724 miles
  • 3055.700 kilometers
  • 1649.946 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
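
The haversine calculation is simple enough to show in full. The sketch below is a minimal Python version using a mean Earth radius of 3958.8 miles and the same SPN/SHA coordinates; names are illustrative:

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2, radius_mi=3958.8):
    """Great-circle distance on a sphere with the given radius, in miles."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_mi * math.asin(math.sqrt(h))

print(haversine_miles(15.1189, 145.7289, 31.1978, 121.3358))  # ≈ 1898.7
```

The small gap between the two results (about half a mile here) comes from the spherical versus ellipsoidal Earth models.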

How long does it take to fly from Saipan to Shanghai?

The estimated flight time from Saipan International Airport to Shanghai Hongqiao International Airport is 4 hours and 5 minutes.
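
The page doesn't state the model behind this estimate. One way to reproduce it is to assume an average block speed of about 465 mph; that figure is back-solved from the distance and time above, not taken from the source:

```python
def flight_time(distance_miles, avg_speed_mph=465):
    """Rough flight-time sketch: distance over an assumed average block speed."""
    hours = distance_miles / avg_speed_mph
    h = int(hours)
    m = round((hours - h) * 60)
    return f"{h} hours and {m} minutes"

print(flight_time(1898.192))  # -> "4 hours and 5 minutes"
```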

Flight carbon footprint between Saipan International Airport (SPN) and Shanghai Hongqiao International Airport (SHA)

On average, flying from Saipan to Shanghai generates about 208 kg (459 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
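
As a rough sketch, the per-passenger figure scales with distance. The emission factor below is back-solved from the page's own numbers (208 kg over 1898 miles) and is not an official value:

```python
KG_PER_MILE = 208 / 1898.192   # ≈ 0.110 kg CO2 per passenger-mile (back-solved)
KG_TO_LB = 2.20462

co2_kg = 1898.192 * KG_PER_MILE
print(round(co2_kg))            # 208 kg
print(round(co2_kg * KG_TO_LB)) # 459 lbs
```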

Map of flight path from Saipan to Shanghai

See the map of the shortest flight path between Saipan International Airport (SPN) and Shanghai Hongqiao International Airport (SHA).

Airport information

Origin: Saipan International Airport
City: Saipan
Country: Northern Mariana Islands
IATA Code: SPN
ICAO Code: PGSN
Coordinates: 15°7′8″N, 145°43′44″E
Destination: Shanghai Hongqiao International Airport
City: Shanghai
Country: China
IATA Code: SHA
ICAO Code: ZSSS
Coordinates: 31°11′52″N, 121°20′9″E
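
The coordinates above are given in degrees, minutes, and seconds, while the distance formulas earlier take decimal degrees. A small conversion helper (hypothetical names) bridges the two:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

spn = (dms_to_decimal(15, 7, 8, "N"), dms_to_decimal(145, 43, 44, "E"))
sha = (dms_to_decimal(31, 11, 52, "N"), dms_to_decimal(121, 20, 9, "E"))
print(spn)  # ≈ (15.1189, 145.7289)
print(sha)  # ≈ (31.1978, 121.3358)
```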