
How far is Sharjah from Qinhuangdao?

The distance between Qinhuangdao (Qinhuangdao Beidaihe Airport) and Sharjah (Sharjah International Airport) is 3761 miles / 6052 kilometers / 3268 nautical miles.

The driving distance from Qinhuangdao (BPE) to Sharjah (SHJ) is 5818 miles / 9363 kilometers, and travel time by car is about 110 hours 45 minutes.

Qinhuangdao Beidaihe Airport – Sharjah International Airport

3761 miles / 6052 kilometers / 3268 nautical miles


Distance from Qinhuangdao to Sharjah

There are several ways to calculate the distance from Qinhuangdao to Sharjah. Here are two standard methods:

Vincenty's formula (applied above)
  • 3760.589 miles
  • 6052.081 kilometers
  • 3267.862 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
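For readers who want to reproduce the figure, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The decimal coordinates come from the airport information at the bottom of the page; the convergence tolerance and iteration cap are arbitrary implementation choices, not parameters the calculator publishes.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Inverse Vincenty distance on the WGS-84 ellipsoid, in statute miles."""
    a = 6378137.0             # WGS-84 semi-major axis (meters)
    f = 1 / 298.257223563     # WGS-84 flattening
    b = (1 - f) * a

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0        # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma) / 1609.344  # meters to statute miles

# BPE (39°39′59″N, 119°3′32″E) to SHJ (25°19′42″N, 55°31′1″E), in decimal degrees
print(round(vincenty_miles(39.6664, 119.0589, 25.3283, 55.5169), 1))  # ≈ 3760.6
```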

Haversine formula
  • 3754.032 miles
  • 6041.528 kilometers
  • 3262.164 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
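A matching sketch of the haversine formula. The mean Earth radius of 6371.0088 km is one common convention; the calculator does not state which radius it uses, so the result may differ slightly from the figure above.

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2, radius_km=6371.0088):
    """Great-circle distance on a sphere of the given mean radius, in miles."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(h)) / 1.609344  # km to miles

print(round(haversine_miles(39.6664, 119.0589, 25.3283, 55.5169), 1))  # ≈ 3754
```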

How long does it take to fly from Qinhuangdao to Sharjah?

The estimated flight time from Qinhuangdao Beidaihe Airport to Sharjah International Airport is 7 hours and 37 minutes.
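The page does not explain how the flight time is estimated. A naive block-time model (distance divided by an average speed) reproduces the figure if the speed is back-solved from the numbers above, roughly 494 mph; that speed is an inference for illustration, not a published parameter.

```python
def estimate_flight_time(distance_miles, avg_speed_mph=494):
    # Naive block-time model: distance / average speed.
    # 494 mph is back-solved from 3761 mi and 7 h 37 min; the site does
    # not publish the model it actually uses.
    hours = distance_miles / avg_speed_mph
    h, m = int(hours), round((hours - int(hours)) * 60)
    return f"{h} hours {m} minutes"

print(estimate_flight_time(3761))  # 7 hours 37 minutes
```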

Flight carbon footprint between Qinhuangdao Beidaihe Airport (BPE) and Sharjah International Airport (SHJ)

On average, flying from Qinhuangdao to Sharjah generates about 427 kg of CO2 per passenger, which is equivalent to 941 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
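The emissions model is likewise unpublished. A simple linear per-mile factor back-solved from these numbers (427 kg over 3761 miles, about 0.114 kg CO2 per passenger-mile) reproduces them, though real estimators typically weight shorter flights more heavily because take-off and climb burn disproportionate fuel, so treat this as illustration only.

```python
KG_CO2_PER_MILE = 427 / 3761   # ≈ 0.1135, back-solved from this page (an assumption)
LBS_PER_KG = 2.20462

def co2_per_passenger(distance_miles):
    # Linear per-mile estimate; returns (kilograms, pounds).
    kg = distance_miles * KG_CO2_PER_MILE
    return kg, kg * LBS_PER_KG

kg, lbs = co2_per_passenger(3761)
print(f"{kg:.0f} kg = {lbs:.0f} lbs")  # 427 kg = 941 lbs
```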

Map of flight path and driving directions from Qinhuangdao to Sharjah

See the map of the shortest flight path between Qinhuangdao Beidaihe Airport (BPE) and Sharjah International Airport (SHJ).

Airport information

Origin: Qinhuangdao Beidaihe Airport
City: Qinhuangdao
Country: China
IATA Code: BPE
ICAO Code: ZBDH
Coordinates: 39°39′59″N, 119°3′32″E
Destination: Sharjah International Airport
City: Sharjah
Country: United Arab Emirates
IATA Code: SHJ
ICAO Code: OMSJ
Coordinates: 25°19′42″N, 55°31′1″E
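The coordinates above are given in degrees/minutes/seconds, while the distance formulas earlier on the page need decimal degrees. A small converter:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    # South and west hemispheres carry a negative sign.
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

print(round(dms_to_decimal(39, 39, 59, "N"), 4))   # 39.6664  (BPE latitude)
print(round(dms_to_decimal(119, 3, 32, "E"), 4))   # 119.0589 (BPE longitude)
print(round(dms_to_decimal(25, 19, 42, "N"), 4))   # 25.3283  (SHJ latitude)
print(round(dms_to_decimal(55, 31, 1, "E"), 4))    # 55.5169  (SHJ longitude)
```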