
How far is Xuzhou from Beihai?

The distance between Beihai (Beihai Fucheng Airport) and Xuzhou (Xuzhou Guanyin International Airport) is 1000 miles / 1610 kilometers / 869 nautical miles.

The driving distance from Beihai (BHY) to Xuzhou (XUZ) is 1232 miles / 1982 kilometers, and travel time by car is about 22 hours 24 minutes.

Beihai Fucheng Airport – Xuzhou Guanyin International Airport

1000 miles / 1610 kilometers / 869 nautical miles


Distance from Beihai to Xuzhou

There are several ways to calculate the distance from Beihai to Xuzhou. Here are two standard methods:

Vincenty's formula (applied above)
  • 1000.492 miles
  • 1610.135 kilometers
  • 869.404 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
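The page does not show its implementation; below is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid. The function name, iteration cap, and convergence tolerance are illustrative choices, not this site's exact code:

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula: geodesic distance in metres on WGS-84."""
    a = 6378137.0               # semi-major axis (m)
    f = 1 / 298.257223563       # flattening
    b = (1 - f) * a             # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):        # iterate until the longitude difference converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0          # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos_sq_alpha = 1 - sin_alpha ** 2
        cos2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos_sq_alpha
                  if cos_sq_alpha else 0.0)   # equatorial-line special case
        C = f / 16 * cos_sq_alpha * (4 + f * (4 - 3 * cos_sq_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos2sm + C * cos_sigma * (-1 + 2 * cos2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u_sq = cos_sq_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
    B = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
    delta_sigma = B * sin_sigma * (cos2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos2sm ** 2)
        - B / 6 * cos2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos2sm ** 2)))
    return b * A * (sigma - delta_sigma)

# BHY (21°32′21″N, 109°17′38″E) to XUZ (34°17′17″N, 117°10′15″E)
metres = vincenty_distance(21.539167, 109.293889, 34.288056, 117.170833)
print(metres / 1000)      # ≈ 1610.1 km
print(metres / 1609.344)  # ≈ 1000.5 miles
```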

Haversine formula
  • 1002.670 miles
  • 1613.641 kilometers
  • 871.297 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
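For comparison, here is a short Python version of the haversine formula. The 6,371 km mean Earth radius is a conventional choice; the small difference from the Vincenty figures above comes entirely from the spherical assumption:

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance (km) on a sphere of the given mean radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h))

# Same airport coordinates as above
print(haversine_distance(21.539167, 109.293889, 34.288056, 117.170833))
# ≈ 1613.6 km, matching the haversine figure listed
```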

How long does it take to fly from Beihai to Xuzhou?

The estimated flight time from Beihai Fucheng Airport to Xuzhou Guanyin International Airport is 2 hours and 23 minutes.
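The page does not state how this estimate is derived. A common rule of thumb is distance divided by a typical cruise speed, plus a fixed allowance for taxi, climb, and descent; the 500 mph and 30-minute values in the sketch below are assumptions for illustration, which is why the result lands near, rather than exactly on, the 2 hours 23 minutes shown:

```python
def estimate_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough airtime: cruise time plus a fixed taxi/climb/descent overhead.
    Both defaults are assumed values, not this site's exact model."""
    total_min = distance_miles / cruise_mph * 60 + overhead_min
    return divmod(round(total_min), 60)

hours, minutes = estimate_flight_time(1000.492)
print(f"{hours} h {minutes} min")  # ≈ 2 h 30 min
```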

What is the time difference between Beihai and Xuzhou?

There is no time difference between Beihai and Xuzhou.

Flight carbon footprint between Beihai Fucheng Airport (BHY) and Xuzhou Guanyin International Airport (XUZ)

On average, flying from Beihai to Xuzhou generates about 151 kg of CO2 per passenger, which is equivalent to 333 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
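As a check on the numbers, the pound conversion and the implied per-mile intensity follow directly from the figures above (dividing by the flight distance is simple arithmetic here, not a published emissions factor):

```python
co2_kg = 151
distance_miles = 1000.492      # Vincenty distance from above
kg_per_lb = 0.45359237

print(round(co2_kg / kg_per_lb))          # 333 lbs
print(round(co2_kg / distance_miles, 3))  # ≈ 0.151 kg CO2 per passenger-mile
```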

Map of flight path and driving directions from Beihai to Xuzhou

See the map of the shortest flight path between Beihai Fucheng Airport (BHY) and Xuzhou Guanyin International Airport (XUZ).

Airport information

Origin: Beihai Fucheng Airport
City: Beihai
Country: China
IATA Code: BHY
ICAO Code: ZGBH
Coordinates: 21°32′21″N, 109°17′38″E
Destination: Xuzhou Guanyin International Airport
City: Xuzhou
Country: China
IATA Code: XUZ
ICAO Code: ZSXZ
Coordinates: 34°17′17″N, 117°10′15″E