
How far is Shache from Yushu?

The distance between Yushu (Yushu Batang Airport) and Shache (Shache Airport) is 1183 miles / 1903 kilometers / 1028 nautical miles.

The driving distance from Yushu (YUS) to Shache (QSZ) is 1687 miles / 2715 kilometers, and travel time by car is about 32 hours 50 minutes.

Yushu Batang Airport – Shache Airport: 1183 miles / 1903 kilometers / 1028 nautical miles

Distance from Yushu to Shache

There are several ways to calculate the distance from Yushu to Shache. Here are two standard methods:

Vincenty's formula (applied above)
  • 1182.660 miles
  • 1903.307 kilometers
  • 1027.704 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
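
For reference, here is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid. This is the standard published algorithm, not the calculator's own source code:

```python
import math

def vincenty_distance_m(lat1, lon1, lat2, lon2):
    """Vincenty's inverse formula on the WGS-84 ellipsoid; returns metres."""
    a = 6378137.0                 # semi-major axis (m)
    f = 1 / 298.257223563         # flattening
    b = (1 - f) * a               # semi-minor axis (m)

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):          # iterate until lambda converges
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.hypot(cosU2 * sinLam,
                              cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        if sinSigma == 0:
            return 0.0            # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = cosSigma - 2 * sinU1 * sinU2 / cos2Alpha if cos2Alpha else 0.0
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (cos2SigmaM + C * cosSigma *
                                    (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < 1e-12:
            break

    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    deltaSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2) -
        B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2) * (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - deltaSigma)

# YUS -> QSZ, using the airport coordinates listed further down this page
print(vincenty_distance_m(32.8364, 97.0364, 38.2808, 77.0750) / 1000)
```

Run on the coordinates under "Airport information" below, this returns roughly 1903 km, in line with the Vincenty figure above.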

Haversine formula
  • 1180.532 miles
  • 1899.883 kilometers
  • 1025.855 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
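
The haversine calculation is compact enough to show in full. A minimal Python sketch, assuming a mean Earth radius of 6371 km (the radius the site actually uses is not stated, but this value reproduces its figures):

```python
import math

def haversine_distance_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# YUS -> QSZ: prints roughly 1900 km, matching the haversine figure above
print(haversine_distance_km(32.8364, 97.0364, 38.2808, 77.0750))
```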

How long does it take to fly from Yushu to Shache?

The estimated flight time from Yushu Batang Airport to Shache Airport is 2 hours and 44 minutes.
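
The site does not publish its flight-time formula. A common rule of thumb is cruise time at a typical jet speed plus a fixed allowance for taxi, climb, and descent; the sketch below uses hypothetical constants (500 mph cruise, 30 minutes overhead), so it lands near, but not exactly on, the 2 hours 44 minutes quoted above:

```python
def estimated_flight_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    # cruise_mph and overhead_min are illustrative assumptions,
    # not the calculator's documented constants
    return distance_miles / cruise_mph * 60 + overhead_min

print(estimated_flight_minutes(1182.66))  # ~172 minutes (about 2 h 52 m)
```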

What is the time difference between Yushu and Shache?

There is no time difference between Yushu and Shache.

Flight carbon footprint between Yushu Batang Airport (YUS) and Shache Airport (QSZ)

On average, flying from Yushu to Shache generates about 161 kg of CO2 per passenger, which equals about 355 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
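
The kilogram-to-pound conversion is a straight multiplication by 2.20462:

```python
KG_PER_PASSENGER = 161
KG_TO_LBS = 2.20462  # pounds per kilogram

print(f"{KG_PER_PASSENGER} kg = {KG_PER_PASSENGER * KG_TO_LBS:.0f} lbs")  # 161 kg = 355 lbs
```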

Map of flight path and driving directions from Yushu to Shache

See the map of the shortest flight path between Yushu Batang Airport (YUS) and Shache Airport (QSZ).

Airport information

Origin Yushu Batang Airport
City: Yushu
Country: China
IATA Code: YUS
ICAO Code: ZLYS
Coordinates: 32°50′11″N, 97°2′11″E
Destination Shache Airport
City: Shache
Country: China
IATA Code: QSZ
ICAO Code: ZWSC
Coordinates: 38°16′51″N, 77°4′30″E
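
The coordinates above are given in degrees, minutes, and seconds. To feed them into the distance functions earlier on this page, convert them to decimal degrees; the helper below is illustrative:

```python
def dms_to_decimal(degrees, minutes, seconds):
    # Northern/eastern hemispheres only, which covers both airports here
    return degrees + minutes / 60 + seconds / 3600

yus = (dms_to_decimal(32, 50, 11), dms_to_decimal(97, 2, 11))   # (32.8364, 97.0364)
qsz = (dms_to_decimal(38, 16, 51), dms_to_decimal(77, 4, 30))   # (38.2808, 77.0750)
```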