
How far is Yushu from Lijiang?

The distance between Lijiang (Lijiang Sanyi International Airport) and Yushu (Yushu Batang Airport) is 466 miles / 750 kilometers / 405 nautical miles.

The driving distance from Lijiang (LJG) to Yushu (YUS) is 717 miles / 1154 kilometers, and travel time by car is about 15 hours 3 minutes.

Lijiang Sanyi International Airport – Yushu Batang Airport

466 miles / 750 kilometers / 405 nautical miles


Distance from Lijiang to Yushu

There are several ways to calculate the distance from Lijiang to Yushu. Here are two standard methods:

Vincenty's formula (applied above)
  • 465.823 miles
  • 749.669 kilometers
  • 404.789 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
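As a rough illustration, the standard inverse Vincenty iteration fits in a few dozen lines of Python. This is a minimal sketch (it omits handling of near-antipodal points); the decimal coordinates are converted from the DMS values in the airport information section below:

```python
from math import radians, sin, cos, tan, atan, atan2, sqrt

def vincenty_miles(lat1, lon1, lat2, lon2):
    """Inverse Vincenty distance on the WGS-84 ellipsoid, in statute miles."""
    a = 6378137.0                  # semi-major axis (m)
    f = 1 / 298.257223563          # flattening
    b = (1 - f) * a                # semi-minor axis (m)

    U1 = atan((1 - f) * tan(radians(lat1)))  # reduced latitudes
    U2 = atan((1 - f) * tan(radians(lat2)))
    L = radians(lon2 - lon1)
    sinU1, cosU1 = sin(U1), cos(U1)
    sinU2, cosU2 = sin(U2), cos(U2)

    lam = L
    for _ in range(100):           # iterate until the longitude converges
        sinlam, coslam = sin(lam), cos(lam)
        sin_sigma = sqrt((cosU2 * sinlam) ** 2 +
                         (cosU1 * sinU2 - sinU1 * cosU2 * coslam) ** 2)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * coslam
        sigma = atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sinlam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
        B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) *
        (-3 + 4 * cos_2sigma_m ** 2)))
    meters = b * A * (sigma - delta_sigma)
    return meters / 1609.344       # meters to statute miles

# LJG and YUS in decimal degrees (converted from the DMS coordinates below)
print(round(vincenty_miles(26.6792, 100.2456, 32.8364, 97.0364), 1))  # ≈ 465.8 miles
```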

Haversine formula
  • 466.875 miles
  • 751.362 kilometers
  • 405.703 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
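The haversine formula is compact enough to show in full. A minimal sketch, using a mean Earth radius of 3,958.8 miles (6,371 km) and the same decimal coordinates:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_miles(lat1, lon1, lat2, lon2, r=3958.8):
    """Great-circle distance on a sphere of radius r (miles by default)."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * r * asin(sqrt(a))

print(round(haversine_miles(26.6792, 100.2456, 32.8364, 97.0364), 3))  # ≈ 466.875 miles
```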

How long does it take to fly from Lijiang to Yushu?

The estimated flight time from Lijiang Sanyi International Airport to Yushu Batang Airport is 1 hour and 22 minutes.
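The page does not show how this estimate is derived. One common rule of thumb (an assumption here, not necessarily this site's exact formula) is cruise time at an average airliner speed plus a fixed buffer for takeoff and landing:

```python
distance_mi = 466              # great-circle distance from above

# Implied average block speed from the quoted 1 h 22 min (82 min) estimate
quoted_min = 82
print(round(distance_mi / (quoted_min / 60)))        # ≈ 341 mph, climb/descent included

# Rule-of-thumb estimate; both parameters are assumptions
buffer_min, cruise_mph = 30, 500
est_min = buffer_min + distance_mi / cruise_mph * 60
print(round(est_min))                                # ≈ 86 min, within a few minutes of the quoted figure
```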

What is the time difference between Lijiang and Yushu?

There is no time difference between Lijiang and Yushu; both observe China Standard Time (UTC+8).

Flight carbon footprint between Lijiang Sanyi International Airport (LJG) and Yushu Batang Airport (YUS)

On average, flying from Lijiang to Yushu generates about 94 kg of CO2 per passenger (about 207 pounds). These figures are estimates and include only the CO2 generated by burning jet fuel.
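For reference, the unit conversion and the per-mile intensity implied by the quoted numbers follow directly:

```python
co2_kg = 94                      # quoted per-passenger estimate
distance_mi = 466

print(round(co2_kg * 2.20462))           # ≈ 207 lb
print(round(co2_kg / distance_mi, 2))    # ≈ 0.20 kg CO2 per passenger-mile
```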

Map of flight path and driving directions from Lijiang to Yushu

See the map of the shortest flight path between Lijiang Sanyi International Airport (LJG) and Yushu Batang Airport (YUS).

Airport information

Origin: Lijiang Sanyi International Airport
City: Lijiang
Country: China
IATA Code: LJG
ICAO Code: ZPLJ
Coordinates: 26°40′45″N, 100°14′44″E
Destination: Yushu Batang Airport
City: Yushu
Country: China
IATA Code: YUS
ICAO Code: ZLYS
Coordinates: 32°50′11″N, 97°2′11″E
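The coordinates above are given in degrees, minutes, and seconds; a small helper converts them to the decimal degrees used in the distance sketches earlier:

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

# Coordinates from the airport information above
print(dms_to_decimal(26, 40, 45, "N"), dms_to_decimal(100, 14, 44, "E"))  # ≈ 26.6792, 100.2456
print(dms_to_decimal(32, 50, 11, "N"), dms_to_decimal(97, 2, 11, "E"))    # ≈ 32.8364, 97.0364
```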