
How far is Sharjah from Guangzhou?

The distance between Guangzhou (Guangzhou Baiyun International Airport) and Sharjah (Sharjah International Airport) is 3617 miles / 5820 kilometers / 3143 nautical miles.

The driving distance from Guangzhou (CAN) to Sharjah (SHJ) is 6478 miles / 10426 kilometers, and travel time by car is about 122 hours 44 minutes.

Distance from Guangzhou to Sharjah

There are several ways to calculate the distance from Guangzhou to Sharjah. Here are two standard methods:

Vincenty's formula (applied above)
  • 3616.594 miles
  • 5820.343 kilometers
  • 3142.734 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
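If you want to reproduce this figure yourself, below is a minimal Python sketch of Vincenty's inverse method. The WGS-84 constants, convergence tolerance, and iteration cap are our assumptions; the calculator does not publish its implementation.

    import math

    def vincenty_inverse(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
        # WGS-84 ellipsoid constants (an assumption; the page does not
        # say which ellipsoid it uses)
        a = 6378137.0          # semi-major axis, metres
        f = 1 / 298.257223563  # flattening
        b = (1 - f) * a        # semi-minor axis, metres

        L = math.radians(lon2 - lon1)
        U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
        U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)

        lam = L  # first guess at longitude difference on the auxiliary sphere
        for _ in range(max_iter):
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.hypot(cosU2 * sin_lam,
                                   cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
            if sin_sigma == 0:
                return 0.0  # coincident points
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos2_alpha = 1 - sin_alpha ** 2
            cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                       if cos2_alpha else 0.0)  # equatorial line: alpha = 90 deg
            C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
            lam_prev = lam
            lam = L + (1 - C) * f * sin_alpha * (
                sigma + C * sin_sigma * (
                    cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
            if abs(lam - lam_prev) < tol:
                break

        u2 = cos2_alpha * (a * a - b * b) / (b * b)
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sm ** 2) -
            B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) *
            (-3 + 4 * cos_2sm ** 2)))
        return b * A * (sigma - d_sigma)  # geodesic distance in metres

    # CAN -> SHJ, using the airports' coordinates in decimal degrees
    metres = vincenty_inverse(23.392222, 113.298889, 25.328333, 55.516944)
    print(metres / 1609.344)  # should land near 3616.6 miles, as quoted above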

Haversine formula
  • 3610.535 miles
  • 5810.593 kilometers
  • 3137.469 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
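The haversine computation is short enough to show in full. A minimal Python sketch, assuming the conventional mean Earth radius of 6,371 km (the calculator does not state which radius it uses):

    import math

    def haversine(lat1, lon1, lat2, lon2):
        """Great-circle distance in kilometres between two points given in
        decimal degrees, assuming a spherical Earth."""
        R = 6371.0  # assumed mean Earth radius, km
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        a = (math.sin(dphi / 2) ** 2 +
             math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
        return 2 * R * math.asin(math.sqrt(a))

    # CAN -> SHJ, using the airports' coordinates in decimal degrees
    km = haversine(23.392222, 113.298889, 25.328333, 55.516944)
    print(km, km / 1.609344)  # should land near 5810 km / 3610 miles, as above

Both sketches take decimal degrees; the degrees-minutes-seconds coordinates in the airport information section convert as shown at the end of this page.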

How long does it take to fly from Guangzhou to Sharjah?

The estimated flight time from Guangzhou Baiyun International Airport to Sharjah International Airport is 7 hours and 20 minutes.

Flight carbon footprint between Guangzhou Baiyun International Airport (CAN) and Sharjah International Airport (SHJ)

On average, flying from Guangzhou to Sharjah generates about 409 kg of CO2 per passenger, which is equivalent to roughly 901 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
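As a quick check on the unit conversion: 409 kg × 2.20462 lb/kg ≈ 901.7 lb, consistent with the figure quoted above.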

Map of flight path and driving directions from Guangzhou to Sharjah

See the map of the shortest flight path between Guangzhou Baiyun International Airport (CAN) and Sharjah International Airport (SHJ).

Airport information

Origin: Guangzhou Baiyun International Airport
City: Guangzhou
Country: China
IATA Code: CAN
ICAO Code: ZGGG
Coordinates: 23°23′32″N, 113°17′56″E
Destination: Sharjah International Airport
City: Sharjah
Country: United Arab Emirates
IATA Code: SHJ
ICAO Code: OMSJ
Coordinates: 25°19′42″N, 55°31′1″E
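
The coordinates above are given in degrees, minutes, and seconds, while the distance formulas earlier on this page take decimal degrees. A small Python conversion helper (the function name is ours, for illustration):

    def dms_to_decimal(degrees, minutes, seconds, hemisphere):
        """Convert degrees/minutes/seconds plus a hemisphere letter
        (N/S/E/W) to signed decimal degrees."""
        sign = -1.0 if hemisphere in ("S", "W") else 1.0
        return sign * (degrees + minutes / 60 + seconds / 3600)

    # Guangzhou Baiyun (CAN): 23°23′32″N, 113°17′56″E
    can = (dms_to_decimal(23, 23, 32, "N"), dms_to_decimal(113, 17, 56, "E"))
    # Sharjah (SHJ): 25°19′42″N, 55°31′1″E
    shj = (dms_to_decimal(25, 19, 42, "N"), dms_to_decimal(55, 31, 1, "E"))
    print(can, shj)  # ≈ (23.3922, 113.2989) and (25.3283, 55.5169)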