
How far is Shihezi from Chifeng?

The distance between Chifeng (Chifeng Yulong Airport) and Shihezi (Shihezi Huayuan Airport) is 1661 miles / 2673 kilometers / 1443 nautical miles.

The driving distance from Chifeng (CIF) to Shihezi (SHF) is 2001 miles / 3220 kilometers, and travel time by car is about 36 hours 10 minutes.

Chifeng Yulong Airport – Shihezi Huayuan Airport

1661 miles / 2673 kilometers / 1443 nautical miles


Distance from Chifeng to Shihezi

There are several ways to calculate the distance from Chifeng to Shihezi. Here are two standard methods:

Vincenty's formula (applied above)
  • 1660.902 miles
  • 2672.963 kilometers
  • 1443.284 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
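
As a rough illustration, this ellipsoidal calculation can be reproduced in Python with the third-party geopy package (an assumption here; geopy's geodesic routine uses Karney's algorithm on the WGS-84 ellipsoid, which agrees with Vincenty's result to well under a metre on a route like this):

    # Ellipsoidal (WGS-84) distance between CIF and SHF.
    # Requires the third-party geopy package (pip install geopy).
    from geopy.distance import geodesic

    chifeng = (42.2350, 118.9078)   # 42°14′6″N, 118°54′28″E in decimal degrees
    shihezi = (44.2419, 85.8903)    # 44°14′31″N, 85°53′25″E in decimal degrees

    d = geodesic(chifeng, shihezi)
    print(f"{d.miles:.3f} miles")              # ≈ 1660.9 miles
    print(f"{d.km:.3f} kilometers")            # ≈ 2673.0 kilometers
    print(f"{d.nautical:.3f} nautical miles")  # ≈ 1443.3 nautical miles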

Haversine formula
  • 1656.480 miles
  • 2665.847 kilometers
  • 1439.442 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
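
For comparison, here is a minimal haversine sketch in plain Python, assuming a mean Earth radius of 6,371 km:

    import math

    def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
        """Great-circle distance between two points on a spherical Earth."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
        return 2 * radius_km * math.asin(math.sqrt(a))

    km = haversine_km(42.2350, 118.9078, 44.2419, 85.8903)  # CIF -> SHF
    print(f"{km:.1f} km / {km * 0.621371:.1f} mi / {km / 1.852:.1f} nm")  # ≈ 2666 km / 1656 mi / 1439 nm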

How long does it take to fly from Chifeng to Shihezi?

The estimated flight time from Chifeng Yulong Airport to Shihezi Huayuan Airport is 3 hours and 38 minutes.
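
The assumptions behind this estimate are not published here; a common rule of thumb is simply the great-circle distance divided by an average block speed. A minimal sketch, assuming an average speed of about 460 mph (which lands close to the 3 hours 38 minutes quoted above):

    # Rough flight-time estimate: distance divided by an assumed average block speed.
    # The 460 mph figure is an assumption, not this calculator's published method.
    def estimate_flight_time(distance_miles, avg_speed_mph=460):
        hours = distance_miles / avg_speed_mph
        h = int(hours)
        m = round((hours - h) * 60)
        return f"{h} hours {m} minutes"

    print(estimate_flight_time(1661))  # "3 hours 37 minutes"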

Flight carbon footprint between Chifeng Yulong Airport (CIF) and Shihezi Huayuan Airport (SHF)

On average, flying from Chifeng to Shihezi generates about 190 kg of CO2 per passenger, which is equivalent to 419 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
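
Dividing the figure above by the flight distance implies roughly 0.11 kg of CO2 per passenger-mile. A small sketch of that back-of-the-envelope conversion (the per-mile factor is derived from the numbers above, not an official emission factor):

    KG_PER_LB = 0.453592

    co2_kg = 190            # per-passenger estimate quoted above
    distance_miles = 1661

    print(f"{co2_kg / KG_PER_LB:.0f} lbs")                              # ≈ 419 lbs
    print(f"{co2_kg / distance_miles:.3f} kg CO2 per passenger-mile")   # ≈ 0.114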

Map of flight path and driving directions from Chifeng to Shihezi

See the map of the shortest flight path between Chifeng Yulong Airport (CIF) and Shihezi Huayuan Airport (SHF).

Airport information

Origin: Chifeng Yulong Airport
City: Chifeng
Country: China
IATA Code: CIF
ICAO Code: ZBCF
Coordinates: 42°14′6″N, 118°54′28″E
Destination: Shihezi Huayuan Airport
City: Shihezi
Country: China
IATA Code: SHF
ICAO Code: ZWHZ
Coordinates: 44°14′31″N, 85°53′25″E
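
The coordinates above are listed in degrees/minutes/seconds; converting them to the decimal degrees used in the distance sketches earlier is straightforward:

    # Convert degrees/minutes/seconds to decimal degrees (North/East positive).
    def dms_to_decimal(degrees, minutes, seconds):
        return degrees + minutes / 60 + seconds / 3600

    print(dms_to_decimal(42, 14, 6))    # 42.235     (CIF latitude)
    print(dms_to_decimal(118, 54, 28))  # ≈ 118.9078 (CIF longitude)
    print(dms_to_decimal(44, 14, 31))   # ≈ 44.2419  (SHF latitude)
    print(dms_to_decimal(85, 53, 25))   # ≈ 85.8903  (SHF longitude)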