
How far is Baise from Shache?

The distance between Shache (Shache Airport) and Baise (Baise Bama Airport) is 2024 miles / 3257 kilometers / 1759 nautical miles.

The driving distance from Shache (QSZ) to Baise (AEB) is 2696 miles / 4339 kilometers, and travel time by car is about 51 hours 40 minutes.

Shache Airport – Baise Bama Airport

2024 miles / 3257 kilometers / 1759 nautical miles


Distance from Shache to Baise

There are several ways to calculate the distance from Shache to Baise. Here are two standard methods:

Vincenty's formula (applied above)
  • 2024.004 miles
  • 3257.319 kilometers
  • 1758.812 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
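For reference, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid, using the airport coordinates listed below. The ellipsoid choice, convergence tolerance, and iteration cap are assumptions; the site does not state its exact parameters.

    import math

    # WGS-84 ellipsoid constants (assumed; the site's exact model is unstated)
    A = 6378137.0             # semi-major axis, meters
    F = 1 / 298.257223563     # flattening
    B = (1 - F) * A           # semi-minor axis, meters

    def vincenty_inverse(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
        """Geodesic distance in meters via Vincenty's inverse formula."""
        U1 = math.atan((1 - F) * math.tan(math.radians(lat1)))  # reduced latitudes
        U2 = math.atan((1 - F) * math.tan(math.radians(lat2)))
        L = math.radians(lon2 - lon1)
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)

        lam = L
        for _ in range(max_iter):
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.hypot(cosU2 * sin_lam,
                                   cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
            if sin_sigma == 0.0:
                return 0.0                       # coincident points
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos2_alpha = 1.0 - sin_alpha ** 2
            cos_2sm = (cos_sigma - 2.0 * sinU1 * sinU2 / cos2_alpha
                       if cos2_alpha else 0.0)   # equatorial-line special case
            C = F / 16.0 * cos2_alpha * (4.0 + F * (4.0 - 3.0 * cos2_alpha))
            lam_prev = lam
            lam = L + (1.0 - C) * F * sin_alpha * (sigma + C * sin_sigma *
                  (cos_2sm + C * cos_sigma * (-1.0 + 2.0 * cos_2sm ** 2)))
            if abs(lam - lam_prev) < tol:        # iterate until lambda converges
                break

        u2 = cos2_alpha * (A ** 2 - B ** 2) / B ** 2
        big_a = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        big_b = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        d_sigma = big_b * sin_sigma * (cos_2sm + big_b / 4 *
                  (cos_sigma * (-1 + 2 * cos_2sm ** 2) -
                   big_b / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) *
                   (-3 + 4 * cos_2sm ** 2)))
        return B * big_a * (sigma - d_sigma)

    # QSZ (38°16′51″N, 77°4′30″E) to AEB (23°43′14″N, 106°57′35″E)
    meters = vincenty_inverse(38.280833, 77.075, 23.720556, 106.959722)
    print(f"{meters / 1609.344:.3f} mi  {meters / 1000:.3f} km  {meters / 1852:.3f} NM")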

Haversine formula
  • 2022.397 miles
  • 3254.733 kilometers
  • 1757.415 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
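A spherical-earth sketch of the same calculation, assuming the commonly used 6,371 km mean Earth radius (the exact radius the site uses is not stated):

    import math

    EARTH_RADIUS_KM = 6371.0  # mean Earth radius; an assumption

    def haversine(lat1, lon1, lat2, lon2):
        """Great-circle distance on a sphere, in kilometers."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        a = (math.sin(dphi / 2) ** 2 +
             math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
        return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

    # QSZ (38°16′51″N, 77°4′30″E) to AEB (23°43′14″N, 106°57′35″E)
    km = haversine(38.280833, 77.075, 23.720556, 106.959722)
    print(f"{km / 1.609344:.3f} mi  {km:.3f} km  {km / 1.852:.3f} NM")

With these assumptions the result lands close to the figures quoted above (about 3254.7 km); the small gap from the Vincenty result reflects the spherical versus ellipsoidal model.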

How long does it take to fly from Shache to Baise?

The estimated flight time from Shache Airport to Baise Bama Airport is 4 hours and 19 minutes.
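The page does not publish its flight-time model. A common rule of thumb, shown below purely as an assumption rather than the site's formula, is great-circle distance at an average ~500 mph plus roughly 30 minutes for takeoff and landing; with these numbers it lands in the same ballpark as the quoted 4 hours 19 minutes.

    def estimated_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
        """Rule-of-thumb block time: distance at an average speed plus fixed
        overhead. cruise_mph and overhead_min are assumptions, not the site's
        published model."""
        total_min = distance_miles / cruise_mph * 60 + overhead_min
        h, m = divmod(round(total_min), 60)
        return f"{h} hours {m} minutes"

    print(estimated_flight_time(2024.004))  # ~4 hours 33 minutes with these defaults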

What is the time difference between Shache and Baise?

There is no time difference between Shache and Baise.
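Both airports follow China's single official time zone, China Standard Time (UTC+8). A quick check with Python's zoneinfo, using Asia/Shanghai for both cities as an assumption based on that fact:

    from datetime import datetime
    from zoneinfo import ZoneInfo

    # Both QSZ and AEB observe China Standard Time (UTC+8)
    cst = ZoneInfo("Asia/Shanghai")
    print(datetime.now(tz=cst).utcoffset())  # 8:00:00 at either airport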

Flight carbon footprint between Shache Airport (QSZ) and Baise Bama Airport (AEB)

On average, flying from Shache to Baise generates about 220 kg of CO2 per passenger, roughly 486 pounds (lbs). These figures are estimates and include only the CO2 produced by burning jet fuel.
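As a rough sketch, the page's own numbers imply an emission factor of about 0.109 kg of CO2 per passenger-mile (220 kg over 2024 miles). The factor below is back-derived from those figures, not an official constant:

    KG_PER_LB = 0.45359237  # exact kg-to-pound conversion

    def co2_estimate_kg(distance_miles, kg_per_passenger_mile=0.109):
        """Per-passenger CO2 from jet-fuel burn; the factor is back-derived
        from this page's figures, not an official emission constant."""
        return distance_miles * kg_per_passenger_mile

    kg = co2_estimate_kg(2024.004)
    print(f"{kg:.1f} kg CO2 ≈ {kg / KG_PER_LB:.1f} lbs")  # close to the page's rounded 220 kg / 486 lbs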

Map of flight path and driving directions from Shache to Baise

See the map of the shortest flight path between Shache Airport (QSZ) and Baise Bama Airport (AEB).

Airport information

Origin: Shache Airport
City: Shache
Country: China
IATA Code: QSZ
ICAO Code: ZWSC
Coordinates: 38°16′51″N, 77°4′30″E

Destination: Baise Bama Airport
City: Baise
Country: China
IATA Code: AEB
ICAO Code: ZGBS
Coordinates: 23°43′14″N, 106°57′35″E