
How far is Lijiang from Bole?

The distance between Bole (Alashankou Bole (Bortala) airport) and Lijiang (Lijiang Sanyi International Airport) is 1603 miles / 2579 kilometers / 1393 nautical miles.

The driving distance from Bole (BPL) to Lijiang (LJG) is 2490 miles / 4007 kilometers, and travel time by car is about 45 hours 19 minutes.

Alashankou Bole (Bortala) airport – Lijiang Sanyi International Airport

1603 miles / 2579 kilometers / 1393 nautical miles


Distance from Bole to Lijiang

There are several ways to calculate the distance from Bole to Lijiang. Here are two standard methods:

Vincenty's formula (applied above)
  • 1602.709 miles
  • 2579.310 kilometers
  • 1392.716 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
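For readers who want to reproduce the figure above, here is a sketch of Vincenty's inverse formula in Python. The WGS-84 ellipsoid parameters and the convergence tolerance are assumptions on our part; the site does not publish the exact parameters it uses.

```python
from math import atan, atan2, radians, sin, cos, tan, sqrt

def vincenty_km(lat1, lon1, lat2, lon2):
    """Inverse Vincenty distance on the WGS-84 ellipsoid, in kilometers."""
    a = 6378137.0              # WGS-84 semi-major axis (meters)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a            # semi-minor axis (meters)

    L = radians(lon2 - lon1)
    U1 = atan((1 - f) * tan(radians(lat1)))   # reduced latitudes
    U2 = atan((1 - f) * tan(radians(lat2)))
    sinU1, cosU1, sinU2, cosU2 = sin(U1), cos(U1), sin(U2), cos(U2)

    lam = L
    for _ in range(200):       # iterate until the longitude difference converges
        sin_lam, cos_lam = sin(lam), cos(lam)
        sin_sigma = sqrt((cosU2 * sin_lam) ** 2 +
                         (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        if sin_sigma == 0:
            return 0.0         # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2) -
        B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - d_sigma) / 1000.0

# Airport coordinates from the airport information section (degrees)
print(round(vincenty_km(44.895, 82.3, 26.679166666666667, 100.24555555555556), 1))  # ~2579 km
```

With these coordinates the result agrees with the 2579.310 km quoted above to within rounding of the input coordinates.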

Haversine formula
  • 1603.422 miles
  • 2580.458 kilometers
  • 1393.336 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
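As a sketch, the haversine calculation for this route takes only a few lines of Python. The coordinates are taken from the airport information below; the 6371 km mean Earth radius is an assumed value.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two points on a spherical Earth."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return radius_km * 2 * asin(sqrt(a))

# BPL: 44°53'42"N, 82°18'0"E    LJG: 26°40'45"N, 100°14'44"E
bpl = (44 + 53/60 + 42/3600, 82 + 18/60)
ljg = (26 + 40/60 + 45/3600, 100 + 14/60 + 44/3600)
print(round(haversine_km(*bpl, *ljg), 1))  # ~2580 km
```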

How long does it take to fly from Bole to Lijiang?

The estimated flight time from Alashankou Bole (Bortala) airport to Lijiang Sanyi International Airport is 3 hours and 32 minutes.
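The site does not publish its flight-time model, but estimates like this are typically a fixed overhead for taxi, climb, and descent plus time at a typical airliner cruise speed. A hypothetical sketch follows; the 500 mph cruise speed and 30-minute overhead are assumptions, so it will not reproduce the 3 hours 32 minutes figure exactly.

```python
def flight_time_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough flight-time estimate: fixed overhead plus time at cruise speed."""
    return overhead_min + distance_miles / cruise_mph * 60

minutes = flight_time_minutes(1603)
print(f"{int(minutes // 60)} h {round(minutes % 60)} min")  # 3 h 42 min with these assumptions
```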

Flight carbon footprint between Alashankou Bole (Bortala) airport (BPL) and Lijiang Sanyi International Airport (LJG)

On average, flying from Bole to Lijiang generates about 186 kg (410 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
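The unit conversion behind the pounds figure is straightforward. As a sketch, using the standard factor of 2.20462 lbs per kg:

```python
def kg_to_lbs(kg, lbs_per_kg=2.20462):
    """Convert kilograms to pounds using the standard conversion factor."""
    return kg * lbs_per_kg

print(round(kg_to_lbs(186)))  # 410 lbs
```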

Map of flight path and driving directions from Bole to Lijiang

See the map of the shortest flight path between Alashankou Bole (Bortala) airport (BPL) and Lijiang Sanyi International Airport (LJG).

Airport information

Origin: Alashankou Bole (Bortala) airport
City: Bole
Country: China
IATA Code: BPL
ICAO Code: ZWBL
Coordinates: 44°53′42″N, 82°18′0″E
Destination: Lijiang Sanyi International Airport
City: Lijiang
Country: China
IATA Code: LJG
ICAO Code: ZPLJ
Coordinates: 26°40′45″N, 100°14′44″E