
How far is Jixi from Shihezi?

The distance between Shihezi (Shihezi Huayuan Airport) and Jixi (Jixi Xingkaihu Airport) is 2200 miles / 3540 kilometers / 1912 nautical miles.

The driving distance from Shihezi (SHF) to Jixi (JXA) is 2755 miles / 4433 kilometers, and travel time by car is about 49 hours 56 minutes.

Shihezi Huayuan Airport – Jixi Xingkaihu Airport

2200 miles / 3540 kilometers / 1912 nautical miles

Distance from Shihezi to Jixi

There are several ways to calculate the distance from Shihezi to Jixi. Here are two standard methods:

Vincenty's formula (applied above)
  • 2199.943 miles
  • 3540.465 kilometers
  • 1911.698 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
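As a rough sketch, an ellipsoidal distance of this kind can be reproduced with the pyproj library, whose Geod class solves the inverse geodesic problem on the WGS84 ellipsoid (it uses Karney's algorithm rather than Vincenty's original iteration, so the result may differ from the figure above by a few metres). The decimal coordinates are converted from the airport information listed below.

```python
from pyproj import Geod

# WGS84 ellipsoid: the same ellipsoidal Earth model a Vincenty-style
# calculation uses
geod = Geod(ellps="WGS84")

# Geod.inv expects longitude first, then latitude
_, _, meters = geod.inv(85.8903, 44.2419,    # Shihezi Huayuan (SHF)
                        131.1928, 45.2928)   # Jixi Xingkaihu (JXA)
print(round(meters / 1609.344, 1), "miles")  # close to the 2199.9 mi above
```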

Haversine formula
  • 2193.845 miles
  • 3530.651 kilometers
  • 1906.399 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
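For illustration, here is a minimal pure-Python haversine sketch using the same airport coordinates; the Earth radius of 3958.8 miles is an assumed mean value, so the result lands close to, but not exactly on, the figure above.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_miles(lat1, lon1, lat2, lon2, radius_miles=3958.8):
    """Great-circle distance between two points on a spherical Earth."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * radius_miles * asin(sqrt(a))

# SHF and JXA coordinates, converted from the airport data below
shf = (44.2419, 85.8903)    # 44°14′31″N, 85°53′25″E
jxa = (45.2928, 131.1928)   # 45°17′34″N, 131°11′34″E
print(round(haversine_miles(*shf, *jxa), 1))  # roughly 2194 miles
```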

How long does it take to fly from Shihezi to Jixi?

The estimated flight time from Shihezi Huayuan Airport to Jixi Xingkaihu Airport is 4 hours and 39 minutes.
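The calculator's exact assumptions are not stated, but estimates like this are typically derived from cruise time at a typical jet speed plus a fixed allowance for taxi, climb and descent. A hedged sketch with an assumed 500 mph cruise speed and a 30-minute allowance gives a figure in the same ballpark as the 4 hours 39 minutes above, though not identical to it.

```python
# Assumed values for illustration only; the site's own model may differ.
def estimated_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    total_min = overhead_min + distance_miles / cruise_mph * 60
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} h {minutes} min"

print(estimated_flight_time(2200))  # about 4 h 54 min with these assumptions
```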

Flight carbon footprint between Shihezi Huayuan Airport (SHF) and Jixi Xingkaihu Airport (JXA)

On average, flying from Shihezi to Jixi generates about 240 kg of CO2 per passenger, which is roughly 530 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
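As a rough, hedged sketch of how such a figure can be derived: burning one kilogram of jet fuel releases about 3.16 kg of CO2, so per-passenger emissions follow from an assumed fuel burn per passenger-kilometre. The 0.0215 kg/km value below is an illustrative assumption chosen to land near the 240 kg estimate; it is not the calculator's published model.

```python
CO2_PER_KG_FUEL = 3.16      # kg of CO2 per kg of jet fuel (standard factor)
fuel_per_pax_km = 0.0215    # assumed kg of fuel per passenger per km

distance_km = 3540
co2_kg = distance_km * fuel_per_pax_km * CO2_PER_KG_FUEL
print(round(co2_kg), "kg CO2 per passenger")   # roughly 240 kg
print(round(co2_kg * 2.20462), "lbs")          # roughly 530 lbs
```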

Map of flight path and driving directions from Shihezi to Jixi

See the map of the shortest flight path between Shihezi Huayuan Airport (SHF) and Jixi Xingkaihu Airport (JXA).

Airport information

Origin: Shihezi Huayuan Airport
City: Shihezi
Country: China
IATA Code: SHF
ICAO Code: ZWHZ
Coordinates: 44°14′31″N, 85°53′25″E
Destination: Jixi Xingkaihu Airport
City: Jixi
Country: China
IATA Code: JXA
ICAO Code: ZYJX
Coordinates: 45°17′34″N, 131°11′34″E