
How far is Bole from Yulin?

The distance between Yulin (Yulin Yuyang Airport) and Bole (Alashankou Bole (Bortala) Airport) is 1485 miles / 2390 kilometers / 1290 nautical miles.

The driving distance from Yulin (UYN) to Bole (BPL) is 1780 miles / 2864 kilometers, and travel time by car is about 32 hours 1 minute.

Yulin Yuyang Airport – Alashankou Bole (Bortala) Airport

1485 Miles
2390 Kilometers
1290 Nautical miles

Distance from Yulin to Bole

There are several ways to calculate the distance from Yulin to Bole. Here are two standard methods:

Vincenty's formula (applied above)
  • 1484.812 miles
  • 2389.573 kilometers
  • 1290.266 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
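
For reference, here is a minimal Python sketch of the inverse Vincenty method on the WGS-84 ellipsoid, using the airport coordinates listed at the bottom of this page. The function name and convergence tolerance are illustrative choices, not taken from this calculator.

import math

def vincenty_miles(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Inverse Vincenty distance on the WGS-84 ellipsoid, in statute miles."""
    a = 6378137.0              # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a            # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))   # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam, cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0                                       # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1.0 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2.0 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
        C = f / 16.0 * cos2_alpha * (4.0 + f * (4.0 - 3.0 * cos2_alpha))
        lam_prev = lam
        lam = L + (1.0 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1.0 + 2.0 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1.0 + u2 / 16384.0 * (4096.0 + u2 * (-768.0 + u2 * (320.0 - 175.0 * u2)))
    B = u2 / 1024.0 * (256.0 + u2 * (-128.0 + u2 * (74.0 - 47.0 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4.0 * (
        cos_sigma * (-1.0 + 2.0 * cos_2sm ** 2)
        - B / 6.0 * cos_2sm * (-3.0 + 4.0 * sin_sigma ** 2) * (-3.0 + 4.0 * cos_2sm ** 2)))

    return b * A * (sigma - d_sigma) / 1609.344              # metres -> miles

# UYN (38°16′9″N, 109°43′51″E) to BPL (44°53′42″N, 82°18′0″E)
print(round(vincenty_miles(38.2692, 109.7308, 44.8950, 82.3000), 1))  # ≈ 1484.8

With these inputs the result lands at roughly 1484.8 miles, in line with the figure above; small differences can come from coordinate rounding.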

Haversine formula
  • 1481.499 miles
  • 2384.241 kilometers
  • 1287.387 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
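
The haversine version is much shorter. Here is a sketch under the common assumption of a 6371 km mean Earth radius; the exact radius this site uses is not stated, so the last decimals may differ.

import math

def haversine_miles(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given mean radius, in statute miles."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a)) / 1.609344   # km -> miles

print(round(haversine_miles(38.2692, 109.7308, 44.8950, 82.3000), 1))  # ≈ 1481.5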

How long does it take to fly from Yulin to Bole?

The estimated flight time from Yulin Yuyang Airport to Alashankou Bole (Bortala) Airport is 3 hours and 18 minutes.
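
Calculators like this one typically turn the great-circle distance into a flight time using an assumed average speed plus a fixed allowance for takeoff and landing. The constants in the sketch below (500 mph cruise, 30-minute allowance) are common rule-of-thumb values, not this site's actual parameters, which is why the result differs slightly from the 3 hours 18 minutes quoted above.

def estimate_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    """Rule-of-thumb flight time: cruise leg plus a fixed takeoff/landing allowance."""
    total_min = distance_miles / cruise_mph * 60 + overhead_min
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} hours {minutes} minutes"

print(estimate_flight_time(1485))   # "3 hours 28 minutes" with these assumed constants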

Flight carbon footprint between Yulin Yuyang Airport (UYN) and Alashankou Bole (Bortala) Airport (BPL)

On average, flying from Yulin to Bole generates about 179 kg of CO2 per passenger, which is equal to 394 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
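
A back-of-the-envelope way to reproduce a figure like this is to multiply the distance flown by a per-passenger emission factor. The factor below (0.075 kg CO2 per passenger-kilometer) is a hypothetical value chosen to match the 179 kg above, not a constant published by this site.

KG_PER_PAX_KM = 0.075    # assumed emission factor (kg CO2 per passenger-km), hypothetical
LBS_PER_KG = 2.20462     # kilograms to pounds

co2_kg = 2390 * KG_PER_PAX_KM          # great-circle distance in kilometers
print(f"{co2_kg:.0f} kg ≈ {co2_kg * LBS_PER_KG:.0f} lbs")   # ~179 kg; the pound figure shifts slightly with rounding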

Map of flight path and driving directions from Yulin to Bole

See the map of the shortest flight path between Yulin Yuyang Airport (UYN) and Alashankou Bole (Bortala) Airport (BPL).

Airport information

Origin Yulin Yuyang Airport
City: Yulin
Country: China
IATA Code: UYN
ICAO Code: ZLYL
Coordinates: 38°16′9″N, 109°43′51″E
Destination Alashankou Bole (Bortala) Airport
City: Bole
Country: China
IATA Code: BPL
ICAO Code: ZWBL
Coordinates: 44°53′42″N, 82°18′0″E