
How far is Jiansanjiang from Turpan?

The distance between Turpan (Turpan Jiaohe Airport) and Jiansanjiang (Jiansanjiang Airport) is 2122 miles / 3415 kilometers / 1844 nautical miles.

The driving distance from Turpan (TLQ) to Jiansanjiang (JSJ) is 2656 miles / 4274 kilometers, and travel time by car is about 48 hours 6 minutes.

Turpan Jiaohe Airport – Jiansanjiang Airport

2122 miles / 3415 kilometers / 1844 nautical miles


Distance from Turpan to Jiansanjiang

There are several ways to calculate the distance from Turpan to Jiansanjiang. Here are two standard methods:

Vincenty's formula (applied above)
  • 2122.166 miles
  • 3415.296 kilometers
  • 1844.112 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
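A sketch of Vincenty's inverse method on the WGS-84 ellipsoid (the calculator's exact constants and convergence tolerance are assumptions, but this standard form reproduces the mileage above):

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2):
    """Geodesic distance on the WGS-84 ellipsoid (Vincenty inverse method)."""
    a = 6378137.0             # WGS-84 semi-major axis, meters
    f = 1 / 298.257223563     # WGS-84 flattening
    b = (1 - f) * a           # semi-minor axis
    L = math.radians(lon2 - lon1)
    # Reduced latitudes on the auxiliary sphere
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)
    lam = L
    for _ in range(200):  # iterate the longitude on the auxiliary sphere
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.hypot(cosU2 * sinLam,
                              cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha * sinAlpha
        cos2SigmaM = (cosSigma - 2 * sinU1 * sinU2 / cos2Alpha
                      if cos2Alpha else 0.0)
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (cos2SigmaM + C * cosSigma *
                                    (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < 1e-12:
            break
    u2 = cos2Alpha * (a * a - b * b) / (b * b)
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    deltaSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2)
        - B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2)
        * (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - deltaSigma) / 1609.344  # meters -> statute miles

# TLQ (43°1′50″N, 89°5′55″E) to JSJ (47°6′36″N, 132°39′37″E)
d = vincenty_miles(43.030556, 89.098611, 47.110000, 132.660278)  # ≈ 2122 miles
```

The iteration usually converges in a handful of passes for airport pairs like this one; Vincenty's method can fail to converge only for nearly antipodal points.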

Haversine formula
  • 2116.366 miles
  • 3405.961 kilometers
  • 1839.072 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
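The haversine calculation is compact enough to show in full; this sketch assumes a mean Earth radius of 6371 km, which matches the kilometer figure above to within a fraction of a percent:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere (mean Earth radius assumed)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# TLQ (43°1′50″N, 89°5′55″E) to JSJ (47°6′36″N, 132°39′37″E)
d = haversine_km(43.030556, 89.098611, 47.110000, 132.660278)  # ≈ 3406 km
```

The ~9 km gap between this result and the Vincenty figure reflects the spherical-versus-ellipsoidal Earth models, not a bug in either formula.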

How long does it take to fly from Turpan to Jiansanjiang?

The estimated flight time from Turpan Jiaohe Airport to Jiansanjiang Airport is 4 hours and 31 minutes.
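The site does not publish its estimation method. A common rule of thumb adds a fixed allowance for taxi, climb, and descent to cruise time at roughly 500 mph; with those assumed parameters it lands near, though not exactly on, the figure above:

```python
def estimated_flight_time(distance_miles, cruise_mph=500.0, overhead_hours=0.5):
    # Rule of thumb: fixed taxi/climb/descent overhead plus cruise time.
    # Both parameters are assumptions, not the calculator's published inputs.
    return overhead_hours + distance_miles / cruise_mph

hours = estimated_flight_time(2122)
h, m = int(hours), round((hours - int(hours)) * 60)  # ≈ 4 h 45 min under these assumptions
```

A slightly higher assumed cruise speed or smaller overhead would reproduce the 4 hours 31 minutes quoted above.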

Flight carbon footprint between Turpan Jiaohe Airport (TLQ) and Jiansanjiang Airport (JSJ)

On average, flying from Turpan to Jiansanjiang generates about 231 kg of CO2 per passenger (roughly 510 pounds). These figures are estimates and include only the CO2 generated by burning jet fuel.
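The 231 kg figure implies an emissions factor of roughly 0.109 kg of CO2 per passenger-mile for this route. A sketch of the estimate and the kilogram-to-pound conversion (the factor is back-derived from the numbers above, not the site's published methodology):

```python
KG_CO2_PER_PASSENGER_MILE = 0.109  # back-derived: 231 kg / 2122 miles; an assumption
KG_TO_LBS = 2.20462                # kilograms to pounds

def co2_per_passenger(distance_miles, factor=KG_CO2_PER_PASSENGER_MILE):
    """Estimate per-passenger CO2 in kg and lbs for a flight of the given length."""
    kg = distance_miles * factor
    return kg, kg * KG_TO_LBS

kg, lbs = co2_per_passenger(2122)  # ≈ 231 kg ≈ 510 lbs
```

Real per-passenger emissions vary with aircraft type, load factor, and cruise altitude, so a single linear factor is only a first-order estimate.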

Map of flight path and driving directions from Turpan to Jiansanjiang

See the map of the shortest flight path between Turpan Jiaohe Airport (TLQ) and Jiansanjiang Airport (JSJ).

Airport information

Origin Turpan Jiaohe Airport
City: Turpan
Country: China
IATA Code: TLQ
ICAO Code: ZWTP
Coordinates: 43°1′50″N, 89°5′55″E
Destination Jiansanjiang Airport
City: Jiansanjiang
Country: China
IATA Code: JSJ
ICAO Code: ZYJS
Coordinates: 47°6′36″N, 132°39′37″E
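The coordinates above are given in degrees, minutes, and seconds; a small helper converts them to the signed decimal degrees that the distance formulas expect:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere="N"):
    """Convert degrees/minutes/seconds plus hemisphere to signed decimal degrees."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

# Turpan Jiaohe Airport: 43°1′50″N, 89°5′55″E
lat = dms_to_decimal(43, 1, 50, "N")   # ≈ 43.0306
lon = dms_to_decimal(89, 5, 55, "E")   # ≈ 89.0986
```

Southern latitudes and western longitudes come out negative, which is the convention both formulas above rely on.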