
How far is Jixi from Wuhai?

The distance between Wuhai (Wuhai Airport) and Jixi (Jixi Xingkaihu Airport) is 1296 miles / 2086 kilometers / 1126 nautical miles.

The driving distance from Wuhai (WUA) to Jixi (JXA) is 1564 miles / 2517 kilometers, and travel time by car is about 28 hours 42 minutes.


Distance from Wuhai to Jixi

There are several ways to calculate the distance from Wuhai to Jixi. Here are two standard methods:

Vincenty's formula (applied above)
  • 1295.890 miles
  • 2085.534 kilometers
  • 1126.098 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
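
As an illustration, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid, using the airport coordinates listed below converted to decimal degrees. The constants, convergence tolerance, and function name are assumptions rather than this page's actual implementation, and the sketch skips the coincident and near-antipodal edge cases; production code should prefer a tested library such as geographiclib.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns statute miles.

    Assumes the two points are neither coincident nor near-antipodal.
    """
    a, f = 6378137.0, 1 / 298.257223563    # WGS-84 semi-major axis (m), flattening
    b = (1 - f) * a
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):                   # iterate until lambda converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    meters = b * A * (sigma - delta_sigma)
    return meters / 1609.344               # meters -> statute miles

# WUA (39°47'36"N, 106°47'57"E) to JXA (45°17'34"N, 131°11'34"E)
print(vincenty_miles(39.7933, 106.7992, 45.2928, 131.1928))  # ~1295.9 miles
```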

Haversine formula
  • 1292.869 miles
  • 2080.671 kilometers
  • 1123.472 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
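
For comparison, a minimal haversine sketch. The 3958.8-mile mean Earth radius is an assumed constant (any consistent radius works), not necessarily the one used by this page:

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance on a sphere of mean radius 3958.8 miles."""
    R = 3958.8                             # mean Earth radius in miles (assumed)
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(h))

print(haversine_miles(39.7933, 106.7992, 45.2928, 131.1928))  # ~1292.9 miles
```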

How long does it take to fly from Wuhai to Jixi?

The estimated flight time from Wuhai Airport to Jixi Xingkaihu Airport is 2 hours and 57 minutes.
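
The page does not say how this figure is derived. A common rule of thumb, sketched below with assumed constants (about 500 mph average speed plus roughly 30 minutes for taxi, climb, and descent), lands a few minutes away from the quoted 2 hours and 57 minutes:

```python
def estimated_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    """Rule-of-thumb flight time: cruise time plus fixed taxi/climb/descent overhead."""
    total_min = overhead_min + distance_miles / cruise_mph * 60
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} h {minutes:02d} min"

print(estimated_flight_time(1296))  # ~3 h 06 min with these assumed constants
```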

What is the time difference between Wuhai and Jixi?

There is no time difference between Wuhai and Jixi; both cities observe China Standard Time (UTC+8).

Flight carbon footprint between Wuhai Airport (WUA) and Jixi Xingkaihu Airport (JXA)

On average, flying from Wuhai to Jixi generates about 167 kg of CO2 per passenger, which is roughly 368 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
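
For reference, the pound figure follows from the standard kilogram-to-pound definition; only the 167 kg estimate comes from this page:

```python
CO2_KG = 167                          # page's per-passenger estimate
KG_PER_LB = 0.45359237                # exact definition of the avoirdupois pound
print(f"{CO2_KG} kg = {CO2_KG / KG_PER_LB:.0f} lbs")  # prints: 167 kg = 368 lbs
```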

Map of flight path and driving directions from Wuhai to Jixi

See the map of the shortest flight path between Wuhai Airport (WUA) and Jixi Xingkaihu Airport (JXA).

Airport information

Origin: Wuhai Airport
City: Wuhai
Country: China
IATA Code: WUA
ICAO Code: ZBUH
Coordinates: 39°47′36″N, 106°47′57″E
Destination: Jixi Xingkaihu Airport
City: Jixi
Country: China
IATA Code: JXA
ICAO Code: ZYJX
Coordinates: 45°17′34″N, 131°11′34″E
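
The coordinates above are given in degrees, minutes, and seconds. As a small illustrative sketch (the helper name is my own, not from this page), here is how they convert to the decimal degrees used by the distance formulas above:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere letter to signed decimal degrees."""
    sign = -1 if hemisphere in "SW" else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# Wuhai Airport: 39°47'36"N, 106°47'57"E
print(dms_to_decimal(39, 47, 36, "N"), dms_to_decimal(106, 47, 57, "E"))
# Jixi Xingkaihu Airport: 45°17'34"N, 131°11'34"E
print(dms_to_decimal(45, 17, 34, "N"), dms_to_decimal(131, 11, 34, "E"))
```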