How far is Shihezi from Sabetta?

The distance between Sabetta (Sabetta International Airport) and Shihezi (Shihezi Huayuan Airport) is 1925 miles / 3097 kilometers / 1672 nautical miles.

Distance from Sabetta to Shihezi

There are several ways to calculate the distance from Sabetta to Shihezi. Here are two standard methods:

Vincenty's formula (applied above)
  • 1924.675 miles
  • 3097.464 kilometers
  • 1672.497 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
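
For reference, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid (a = 6378137 m, f = 1/298.257223563). The decimal coordinates are converted from the airport coordinates listed under "Airport information" below; this is an illustrative implementation, not the calculator's own code.

```python
from math import radians, sin, cos, tan, atan, atan2, sqrt

def vincenty_miles(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula on the WGS-84 ellipsoid (statute miles)."""
    a = 6378137.0              # semi-major axis (metres)
    f = 1 / 298.257223563      # flattening
    b = (1 - f) * a            # semi-minor axis (metres)

    U1 = atan((1 - f) * tan(radians(lat1)))   # reduced latitudes
    U2 = atan((1 - f) * tan(radians(lat2)))
    L = radians(lon2 - lon1)
    sinU1, cosU1 = sin(U1), cos(U1)
    sinU2, cosU2 = sin(U2), cos(U2)

    lam = L
    for _ in range(200):  # iterate lambda to convergence
        sin_lam, cos_lam = sin(lam), cos(lam)
        sin_sigma = sqrt((cosU2 * sin_lam) ** 2 +
                         (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm +
            C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2) -
        B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) *
        (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma) / 1609.344  # metres -> miles

# SBT (71°13′9″N, 72°3′7″E) to SHF (44°14′31″N, 85°53′25″E)
print(vincenty_miles(71.2192, 72.0519, 44.2419, 85.8903))  # ≈ 1924.7
```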

Haversine formula
  • 1921.538 miles
  • 3092.416 kilometers
  • 1669.771 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (the great-circle distance, i.e. the shortest path between two points along the surface).
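
The haversine computation is compact enough to show in full. Below is a minimal Python sketch using a mean earth radius of 3,958.8 miles; the coordinates are the same airport positions converted to decimal degrees.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_miles(lat1, lon1, lat2, lon2, radius_miles=3958.8):
    """Great-circle distance between two points on a spherical earth."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * radius_miles * asin(sqrt(a))

# SBT (71.2192°N, 72.0519°E) to SHF (44.2419°N, 85.8903°E)
print(haversine_miles(71.2192, 72.0519, 44.2419, 85.8903))
# ≈ 1921.6, close to the 1921.538 quoted above (rounding of inputs)
```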

How long does it take to fly from Sabetta to Shihezi?

The estimated flight time from Sabetta International Airport to Shihezi Huayuan Airport is 4 hours and 8 minutes.
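
The page does not document how this estimate is derived. A common rule of thumb for such calculators is a fixed allowance for taxi, climb, and descent plus cruise time at an assumed average ground speed; the sketch below is built on that assumption. The parameters (530 mph, 30 minutes) are chosen purely for illustration; they happen to reproduce the figure above but are not the calculator's published values.

```python
def estimate_flight_time(distance_miles, cruise_mph=530, overhead_min=30):
    """Rough block-time estimate: fixed taxi/climb/descent allowance
    plus cruise at an assumed average ground speed (illustrative values)."""
    total_min = overhead_min + distance_miles / cruise_mph * 60
    h, m = divmod(round(total_min), 60)
    return f"{h} hours and {m} minutes"

print(estimate_flight_time(1925))  # "4 hours and 8 minutes" with these assumptions
```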

Flight carbon footprint between Sabetta International Airport (SBT) and Shihezi Huayuan Airport (SHF)

On average, flying from Sabetta to Shihezi generates about 211 kg of CO2 per passenger, which is roughly 465 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
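
The emission model behind this number is likewise not published. The sketch below simply back-calculates a per-kilometre factor from the figures on this page (about 0.068 kg of CO2 per passenger-kilometre) and converts kilograms to pounds; the factor is an assumption derived from this page, not an official coefficient.

```python
KG_PER_LB = 0.45359237  # exact definition of the avoirdupois pound

def co2_estimate_kg(distance_km, kg_per_km=0.0681):
    """Illustrative per-passenger CO2 estimate; the emission factor is
    back-calculated from 211 kg over 3097 km, not an official value."""
    return distance_km * kg_per_km

kg = co2_estimate_kg(3097)
print(f"{kg:.0f} kg = {kg / KG_PER_LB:.0f} lbs")  # 211 kg = 465 lbs
```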

Map of flight path from Sabetta to Shihezi

See the map of the shortest flight path between Sabetta International Airport (SBT) and Shihezi Huayuan Airport (SHF).

Airport information

Origin: Sabetta International Airport
City: Sabetta
Country: Russia
IATA Code: SBT
ICAO Code: USDA
Coordinates: 71°13′9″N, 72°3′7″E
Destination: Shihezi Huayuan Airport
City: Shihezi
Country: China
IATA Code: SHF
ICAO Code: ZWHZ
Coordinates: 44°14′31″N, 85°53′25″E