How far is Jiansanjiang from Lhasa?

The distance between Lhasa (Lhasa Gonggar Airport) and Jiansanjiang (Jiansanjiang Airport) is 2542 miles / 4091 kilometers / 2209 nautical miles.

The driving distance from Lhasa (LXA) to Jiansanjiang (JSJ) is 3327 miles / 5354 kilometers, and travel time by car is about 60 hours 57 minutes.

Lhasa Gonggar Airport – Jiansanjiang Airport

  • 2542 miles
  • 4091 kilometers
  • 2209 nautical miles

Distance from Lhasa to Jiansanjiang

There are several ways to calculate the distance from Lhasa to Jiansanjiang. Here are two standard methods:

Vincenty's formula (applied above)
  • 2542.144 miles
  • 4091.184 kilometers
  • 2209.063 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
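
For readers who want to reproduce the number, here is a minimal Python sketch of the standard Vincenty inverse method on the WGS-84 ellipsoid. The function name and convergence tolerance are illustrative choices, not the calculator's own code. With the airport coordinates listed further down the page, it returns roughly the 4091 km shown above.

```python
import math

def vincenty_inverse(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Geodesic distance in metres on the WGS-84 ellipsoid."""
    a = 6378137.0              # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a            # semi-minor axis (m)

    # Reduced latitudes and difference in longitude.
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):  # iterate longitude on the auxiliary sphere
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0         # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1.0 - sin_alpha ** 2
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                   if cos2_alpha else 0.0)  # guard for equatorial lines
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a * a - b * b) / (b * b)
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - d_sigma)

# LXA (29°17′52″N, 90°54′42″E) to JSJ (47°6′36″N, 132°39′37″E):
print(vincenty_inverse(29.29778, 90.91167, 47.11000, 132.66028) / 1000)  # ≈ 4091 km
```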

Haversine formula
  • 2538.539 miles
  • 4085.383 kilometers
  • 2205.930 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
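
The haversine formula is short enough to show in full. Below is a minimal Python sketch; the 6371 km mean Earth radius is a common textbook choice, not a value stated on this page. With the same airport coordinates, it reproduces the ~4085 km figure above.

```python
import math

def haversine(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in km, assuming a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    # a is the square of half the chord length between the points.
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    # c is the angular distance in radians.
    c = 2 * math.asin(math.sqrt(a))
    return radius_km * c

# LXA (29°17′52″N, 90°54′42″E) to JSJ (47°6′36″N, 132°39′37″E):
print(haversine(29.29778, 90.91167, 47.11000, 132.66028))  # ≈ 4085 km
```

The haversine result is slightly shorter than Vincenty's because a sphere underestimates distances along this mostly east-west route on the flattened ellipsoid.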

How long does it take to fly from Lhasa to Jiansanjiang?

The estimated flight time from Lhasa Gonggar Airport to Jiansanjiang Airport is 5 hours and 18 minutes.
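
As a sanity check, this estimate is consistent with a simple distance-divided-by-speed calculation. In the sketch below, the 480 mph gate-to-gate average speed is an assumption chosen to illustrate the arithmetic; the site does not publish the formula it uses.

```python
def estimated_flight_time(distance_miles, block_speed_mph=480):
    # block_speed_mph is an assumed gate-to-gate average that folds in
    # taxi, climb, and descent; real schedules vary by aircraft and winds.
    total_minutes = round(distance_miles / block_speed_mph * 60)
    return divmod(total_minutes, 60)

hours, minutes = estimated_flight_time(2542)
print(f"{hours} h {minutes} min")  # 5 h 18 min with the assumed speed
```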

Flight carbon footprint between Lhasa Gonggar Airport (LXA) and Jiansanjiang Airport (JSJ)

On average, flying from Lhasa to Jiansanjiang generates about 280 kg (618 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
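
The arithmetic behind such an estimate can be sketched as follows. The 0.11 kg of CO2 per passenger-mile emission factor below is an assumed round number that happens to reproduce the ~280 kg figure, not one published by this site; small differences in the pound value come from rounding.

```python
KG_PER_LB = 0.45359237  # exact definition of the pound in kilograms

def co2_per_passenger_kg(distance_miles, kg_per_passenger_mile=0.11):
    # kg_per_passenger_mile is an assumed emission factor for jet-fuel
    # burn per passenger; the page does not state the factor it uses.
    return distance_miles * kg_per_passenger_mile

kg = co2_per_passenger_kg(2542)
print(f"{kg:.0f} kg ≈ {kg / KG_PER_LB:.0f} lbs")  # ≈ 280 kg ≈ 616 lbs
```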

Map of flight path and driving directions from Lhasa to Jiansanjiang

See the map of the shortest flight path between Lhasa Gonggar Airport (LXA) and Jiansanjiang Airport (JSJ).

Airport information

Origin: Lhasa Gonggar Airport
City: Lhasa
Country: China
IATA Code: LXA
ICAO Code: ZULS
Coordinates: 29°17′52″N, 90°54′42″E

Destination: Jiansanjiang Airport
City: Jiansanjiang
Country: China
IATA Code: JSJ
ICAO Code: ZYJS
Coordinates: 47°6′36″N, 132°39′37″E