How far is Fujairah from Shanghai?

The distance between Shanghai (Shanghai Hongqiao International Airport) and Fujairah (Fujairah International Airport) is 3934 miles / 6331 kilometers / 3419 nautical miles.

The driving distance from Shanghai (SHA) to Fujairah (FJR) is 6393 miles / 10289 kilometers, and travel time by car is about 120 hours 45 minutes.

Distance from Shanghai to Fujairah

There are several ways to calculate the distance from Shanghai to Fujairah. Here are two standard methods:

Vincenty's formula (applied above)
  • 3934.153 miles
  • 6331.406 kilometers
  • 3418.686 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
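
As an illustration, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The coordinates come from the airport information below; the convergence tolerance and iteration cap are arbitrary choices, and this is not necessarily the exact implementation behind the figures above.

    from math import atan, atan2, cos, radians, sin, sqrt, tan

    def vincenty_km(lat1, lon1, lat2, lon2):
        """Ellipsoidal distance in kilometers via Vincenty's inverse formula (WGS-84)."""
        a, f = 6378137.0, 1 / 298.257223563      # WGS-84 semi-major axis (m) and flattening
        b = (1 - f) * a
        U1 = atan((1 - f) * tan(radians(lat1)))  # reduced latitudes
        U2 = atan((1 - f) * tan(radians(lat2)))
        L = radians(lon2 - lon1)
        sinU1, cosU1, sinU2, cosU2 = sin(U1), cos(U1), sin(U2), cos(U2)
        lam = L
        for _ in range(200):                     # iterate lambda to convergence
            sin_lam, cos_lam = sin(lam), cos(lam)
            sin_sigma = sqrt((cosU2 * sin_lam) ** 2 +
                             (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
            if sin_sigma == 0:
                return 0.0                       # coincident points
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos2_alpha = 1 - sin_alpha ** 2
            cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
            C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
            lam_prev = lam
            lam = L + (1 - C) * f * sin_alpha * (sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
            if abs(lam - lam_prev) < 1e-12:
                break
        u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sm ** 2) -
            B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
        return b * A * (sigma - d_sigma) / 1000  # meters -> kilometers

    # SHA -> FJR, decimal degrees from the airport information below
    print(vincenty_km(31.19778, 121.33583, 25.11194, 56.32389))  # ~6331 km

Note that Vincenty's iteration can fail to converge for nearly antipodal points, which is why some libraries prefer Karney's algorithm for geodesic distances.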

Haversine formula
  • 3927.088 miles
  • 6320.036 kilometers
  • 3412.546 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth, giving the great-circle distance: the shortest path between two points along the surface.
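
For comparison, a self-contained Python sketch of the haversine formula; the 3,958.8-mile mean Earth radius is a common convention and may differ slightly from the radius this site assumes.

    from math import asin, cos, radians, sin, sqrt

    def haversine_miles(lat1, lon1, lat2, lon2):
        """Great-circle distance in statute miles on a spherical Earth."""
        R = 3958.8  # mean Earth radius in miles (~6371 km); an assumed value
        phi1, phi2 = radians(lat1), radians(lat2)
        dphi = radians(lat2 - lat1)
        dlam = radians(lon2 - lon1)
        h = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
        return 2 * R * asin(sqrt(h))

    # SHA -> FJR, using the coordinates from the airport information below
    print(haversine_miles(31.19778, 121.33583, 25.11194, 56.32389))  # ~3927 miles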

How long does it take to fly from Shanghai to Fujairah?

The estimated flight time from Shanghai Hongqiao International Airport to Fujairah International Airport is 7 hours and 56 minutes.
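
The assumptions behind this estimate are not published. A common rule of thumb is cruise time at an assumed average speed plus a fixed allowance for takeoff and landing; the sketch below uses a hypothetical 500 mph and 30 minutes, so its output will not exactly match the 7 hours 56 minutes quoted above.

    def flight_time_estimate(distance_miles, cruise_mph=500, overhead_min=30):
        """Rough block time: cruise at an assumed speed plus a fixed allowance."""
        total_min = distance_miles / cruise_mph * 60 + overhead_min
        hours, minutes = divmod(round(total_min), 60)
        return f"{hours} hours {minutes} minutes"

    print(flight_time_estimate(3934))  # assumed parameters, not the site's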

Flight carbon footprint between Shanghai Hongqiao International Airport (SHA) and Fujairah International Airport (FJR)

On average, flying from Shanghai to Fujairah generates about 448 kg (988 lb) of CO2 per passenger. This figure is an estimate and includes only the CO2 generated by burning jet fuel.
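
A quick check of the unit conversion and the implied per-distance intensity, using only the figures quoted above (2.20462 lb/kg is the standard conversion factor):

    CO2_KG = 448       # per-passenger estimate from above
    ROUTE_KM = 6331    # great-circle distance from above
    print(f"{CO2_KG * 2.20462:.0f} lb")                              # ~988 lb
    print(f"{CO2_KG * 1000 / ROUTE_KM:.1f} g CO2 per passenger-km")  # ~70.8 g/km implied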

Map of flight path and driving directions from Shanghai to Fujairah

See the map of the shortest flight path between Shanghai Hongqiao International Airport (SHA) and Fujairah International Airport (FJR).

Airport information

Origin Shanghai Hongqiao International Airport
City: Shanghai
Country: China
IATA Code: SHA
ICAO Code: ZSSS
Coordinates: 31°11′52″N, 121°20′9″E
Destination Fujairah International Airport
City: Fujairah
Country: United Arab Emirates
IATA Code: FJR
ICAO Code: OMFJ
Coordinates: 25°6′43″N, 56°19′26″E
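
The coordinates above are given in degrees, minutes, and seconds, while the distance formulas earlier expect decimal degrees. A small converter, assuming the usual convention that N/E are positive and S/W negative:

    def dms_to_decimal(degrees, minutes, seconds, hemisphere):
        """Convert degrees/minutes/seconds to decimal degrees; S and W are negative."""
        sign = -1 if hemisphere in ("S", "W") else 1
        return sign * (degrees + minutes / 60 + seconds / 3600)

    # SHA: 31°11′52″N, 121°20′9″E -> (31.19778, 121.33583)
    print(dms_to_decimal(31, 11, 52, "N"), dms_to_decimal(121, 20, 9, "E"))
    # FJR: 25°6′43″N, 56°19′26″E -> (25.11194, 56.32389)
    print(dms_to_decimal(25, 6, 43, "N"), dms_to_decimal(56, 19, 26, "E"))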