
How far is Xuzhou from Anshan?

The distance between Anshan (Anshan Teng'ao Airport) and Xuzhou (Xuzhou Guanyin International Airport) is 564 miles / 907 kilometers / 490 nautical miles.

The driving distance from Anshan (AOG) to Xuzhou (XUZ) is 747 miles / 1202 kilometers, and travel time by car is about 13 hours 44 minutes.

Anshan Teng'ao Airport – Xuzhou Guanyin International Airport


Distance from Anshan to Xuzhou

There are several ways to calculate the distance from Anshan to Xuzhou. Here are two standard methods:

Vincenty's formula (applied above)
  • 563.640 miles
  • 907.090 kilometers
  • 489.790 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
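As a sketch of how the Vincenty inverse method can be implemented (a minimal Python version assuming the WGS-84 ellipsoid parameters; the airport coordinates are those listed in the airport information section, converted to decimal degrees):

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns kilometers."""
    a = 6378137.0              # semi-major axis (m)
    f = 1 / 298.257223563      # flattening
    b = (1 - f) * a            # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0         # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0

# Anshan (AOG) to Xuzhou (XUZ)
print(round(vincenty_km(41.10528, 122.85389, 34.28806, 117.17083), 1))
```

The last decimal can differ slightly from the figure above depending on how the airport coordinates are rounded.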

Haversine formula
  • 563.956 miles
  • 907.599 kilometers
  • 490.064 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
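A minimal Python sketch of the haversine computation, assuming a mean Earth radius of 6,371 km and the airport coordinates from the airport information section (so the last decimals may differ slightly from the figures above):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# Anshan (AOG) to Xuzhou (XUZ), coordinates in decimal degrees
km = haversine_km(41.10528, 122.85389, 34.28806, 117.17083)
print(round(km, 1), "km /", round(km / 1.609344, 1), "mi")
```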

How long does it take to fly from Anshan to Xuzhou?

The estimated flight time from Anshan Teng'ao Airport to Xuzhou Guanyin International Airport is 1 hour and 34 minutes.
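The page does not state how this estimate is derived; a common rule of thumb (an assumption here, not necessarily the site's formula) is cruise time at roughly 500 mph plus about 30 minutes of overhead for taxi, climb, and descent:

```python
def flight_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough airline block-time estimate: cruise time plus a fixed overhead."""
    return distance_miles / cruise_mph * 60 + overhead_min

total = flight_minutes(564)
print(f"{int(total // 60)} h {round(total % 60)} min")
```

For the 564-mile route this lands within a few minutes of the 1 hour 34 minutes quoted above.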

What is the time difference between Anshan and Xuzhou?

There is no time difference between Anshan and Xuzhou.

Flight carbon footprint between Anshan Teng'ao Airport (AOG) and Xuzhou Guanyin International Airport (XUZ)

On average, flying from Anshan to Xuzhou generates about 108 kg of CO2 per passenger, which is equivalent to about 238 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
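The unit conversion in the paragraph above is straightforward (1 kg ≈ 2.20462 lb):

```python
KG_TO_LB = 2.20462  # pounds per kilogram

def kg_to_lb(kg):
    return kg * KG_TO_LB

print(round(kg_to_lb(108)))  # 108 kg of CO2 in pounds → 238
```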

Map of flight path and driving directions from Anshan to Xuzhou

See the map of the shortest flight path between Anshan Teng'ao Airport (AOG) and Xuzhou Guanyin International Airport (XUZ).

Airport information

Origin: Anshan Teng'ao Airport
City: Anshan
Country: China
IATA Code: AOG
ICAO Code: ZYAS
Coordinates: 41°6′19″N, 122°51′14″E
Destination: Xuzhou Guanyin International Airport
City: Xuzhou
Country: China
IATA Code: XUZ
ICAO Code: ZSXZ
Coordinates: 34°17′17″N, 117°10′15″E
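The coordinates above are given in degrees, minutes, and seconds, while distance formulas work in decimal degrees. A small conversion sketch:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter (N/S/E/W)
    to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# Anshan Teng'ao Airport: 41°6′19″N, 122°51′14″E
print(round(dms_to_decimal(41, 6, 19, "N"), 5),
      round(dms_to_decimal(122, 51, 14, "E"), 5))
```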