
How far is Jixi from Shache?

The distance between Shache (Shache Airport) and Jixi (Jixi Xingkaihu Airport) is 2783 miles / 4479 kilometers / 2419 nautical miles.

The driving distance from Shache (QSZ) to Jixi (JXA) is 3401 miles / 5473 kilometers, and travel time by car is about 62 hours 27 minutes.

Shache Airport – Jixi Xingkaihu Airport
2783 miles / 4479 kilometers / 2419 nautical miles


Distance from Shache to Jixi

There are several ways to calculate the distance from Shache to Jixi. Here are two standard methods:

Vincenty's formula (applied above)
  • 2783.412 miles
  • 4479.468 kilometers
  • 2418.719 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
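For readers who want to reproduce the ellipsoidal figure, the short Python sketch below uses the pyproj library's geodesic solver on the WGS-84 ellipsoid. This is only an illustration, not the exact code behind this page, and the decimal coordinates are converted from the DMS values listed under Airport information below.

# A minimal sketch of an ellipsoidal distance check using pyproj, which solves
# the inverse geodesic problem on the WGS-84 ellipsoid. This is an illustration
# only, not necessarily the method used by this site.
from pyproj import Geod

# Airport coordinates in decimal degrees (converted from the DMS coordinates
# listed under "Airport information" below).
QSZ = (38.2808, 77.0750)    # Shache Airport: lat, lon
JXA = (45.2928, 131.1928)   # Jixi Xingkaihu Airport: lat, lon

geod = Geod(ellps="WGS84")
# Geod.inv takes lon/lat order and returns forward azimuth, back azimuth, distance in meters.
_, _, meters = geod.inv(QSZ[1], QSZ[0], JXA[1], JXA[0])

print(f"{meters / 1000:.1f} km")         # roughly 4479 km
print(f"{meters / 1609.344:.1f} miles")  # roughly 2783 miles
print(f"{meters / 1852:.1f} NM")         # roughly 2419 nautical miles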

Haversine formula
  • 2776.499 miles
  • 4468.342 kilometers
  • 2412.711 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
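The haversine result above can be reproduced with a few lines of Python. The sketch below assumes a mean Earth radius of 6371 km, so the output may differ from the quoted figures by a few kilometers depending on the radius and coordinate rounding used.

# A minimal sketch of the haversine (great-circle) calculation, assuming a mean
# Earth radius of 6371 km.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points on a spherical Earth."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
    return 2 * radius_km * asin(sqrt(a))

# Shache (QSZ) and Jixi Xingkaihu (JXA) in decimal degrees
km = haversine_km(38.2808, 77.0750, 45.2928, 131.1928)
print(f"{km:.1f} km, {km / 1.609344:.1f} miles, {km / 1.852:.1f} NM")  # about 4468 km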

How long does it take to fly from Shache to Jixi?

The estimated flight time from Shache Airport to Jixi Xingkaihu Airport is 5 hours and 46 minutes.
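The page does not state how the flight time is derived. One common rule of thumb, shown below as an assumption rather than this site's documented method, is cruise time at roughly 850 km/h plus a fixed 30-minute allowance for taxi, climb, and descent; with the 4479 km distance above it lands close to 5 hours 46 minutes.

# A rough flight-time heuristic (an assumption, not this site's documented method):
# distance at a typical ~850 km/h cruise speed plus a fixed 30-minute allowance
# for taxi, climb, and descent.
def estimate_flight_time(distance_km, cruise_kmh=850, overhead_min=30):
    total_min = distance_km / cruise_kmh * 60 + overhead_min
    hours, minutes = divmod(round(total_min), 60)
    return hours, minutes

h, m = estimate_flight_time(4479)
print(f"about {h} h {m} min")  # close to the 5 h 46 min quoted above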

What is the time difference between Shache and Jixi?

There is no time difference between Shache and Jixi.

Flight carbon footprint between Shache Airport (QSZ) and Jixi Xingkaihu Airport (JXA)

On average, flying from Shache to Jixi generates about 308 kg of CO2 per passenger, which is equivalent to roughly 680 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
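Only the kilogram-to-pound arithmetic behind that sentence is shown below; the 308 kg emission estimate itself comes from this page.

# Unit conversion only; the 308 kg figure is this page's estimate, not computed here.
co2_kg = 308
co2_lbs = co2_kg * 2.20462  # 1 kg is about 2.20462 lb
print(f"{co2_kg} kg is about {co2_lbs:.0f} lbs")  # about 679 lbs, rounded to 680 above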


Airport information

Origin Shache Airport
City: Shache
Country: China
IATA Code: QSZ
ICAO Code: ZWSC
Coordinates: 38°16′51″N, 77°4′30″E
Destination Jixi Xingkaihu Airport
City: Jixi
Country: China
IATA Code: JXA
ICAO Code: ZYJX
Coordinates: 45°17′34″N, 131°11′34″E