
How far is Jixi from Wuzhou?

The distance between Wuzhou (Wuzhou Changzhoudao Airport) and Jixi (Jixi Xingkaihu Airport) is 1877 miles / 3021 kilometers / 1631 nautical miles.

The driving distance from Wuzhou (WUZ) to Jixi (JXA) is 2279 miles / 3668 kilometers, and travel time by car is about 41 hours 9 minutes.

Wuzhou Changzhoudao Airport – Jixi Xingkaihu Airport

1877 Miles
3021 Kilometers
1631 Nautical miles


Distance from Wuzhou to Jixi

There are several ways to calculate the distance from Wuzhou to Jixi. Here are two standard methods:

Vincenty's formula (applied above)
  • 1876.857 miles
  • 3020.508 kilometers
  • 1630.944 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
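The ellipsoidal calculation above can be sketched with the standard Vincenty inverse algorithm. This is a minimal implementation assuming the WGS-84 ellipsoid (semi-major axis 6 378 137 m, flattening 1/298.257223563); the function name `vincenty_km` and the convergence tolerance are illustrative choices, not part of the site's published code.

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula on the WGS-84 ellipsoid (assumed datum)."""
    a = 6378137.0                 # semi-major axis, metres
    f = 1 / 298.257223563         # flattening
    b = (1 - f) * a               # semi-minor axis
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)
    lam = L
    for _ in range(200):          # iterate lambda until convergence
        sinlam, coslam = math.sin(lam), math.cos(lam)
        sin_sigma = math.sqrt((cosU2 * sinlam) ** 2 +
                              (cosU1 * sinU2 - sinU1 * cosU2 * coslam) ** 2)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * coslam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sinlam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma *
                                     (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break
    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    dsigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2) -
        B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) *
        (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - dsigma) / 1000  # geodesic length in km

# WUZ (23°27′24″N, 111°14′52″E) and JXA (45°17′34″N, 131°11′34″E)
wuz = (23 + 27/60 + 24/3600, 111 + 14/60 + 52/3600)
jxa = (45 + 17/60 + 34/3600, 131 + 11/60 + 34/3600)
print(round(vincenty_km(*wuz, *jxa), 1))  # ≈ 3020.5 km
```

For the two airports above this reproduces the roughly 3020.5 km figure quoted on the page.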

Haversine formula
  • 1878.246 miles
  • 3022.744 kilometers
  • 1632.151 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
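The spherical calculation is much simpler. This is a minimal sketch assuming a mean earth radius of 6371 km (the exact radius the site uses is not stated, so results can differ by a kilometre or two):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a spherical earth (assumed radius 6371 km)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2 +
         math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# WUZ (23°27′24″N, 111°14′52″E) and JXA (45°17′34″N, 131°11′34″E)
wuz = (23 + 27/60 + 24/3600, 111 + 14/60 + 52/3600)
jxa = (45 + 17/60 + 34/3600, 131 + 11/60 + 34/3600)
print(round(haversine_km(*wuz, *jxa), 1))  # ≈ 3022.7 km
```

This agrees with the roughly 3022.7 km haversine figure listed above to within rounding.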

How long does it take to fly from Wuzhou to Jixi?

The estimated flight time from Wuzhou Changzhoudao Airport to Jixi Xingkaihu Airport is 4 hours and 3 minutes.
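A flight-time estimate of this kind is typically distance divided by an average cruise speed plus a fixed allowance for taxi, climb, and descent. The site's exact parameters are not published; the sketch below assumes a 500 mph average speed and a 30-minute buffer, which are illustrative values only:

```python
def flight_time(distance_mi, cruise_mph=500.0, buffer_h=0.5):
    """Rough block-time estimate: cruise time plus a fixed buffer (assumed values)."""
    hours = distance_mi / cruise_mph + buffer_h
    h = int(hours)
    m = round((hours - h) * 60)
    return f"{h} h {m:02d} min"

print(flight_time(1877))  # 4 h 15 min with these assumed parameters
```

With these assumed parameters the result is close to, but not identical to, the 4 hours 3 minutes quoted above, which suggests the site uses slightly different constants.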

What is the time difference between Wuzhou and Jixi?

There is no time difference between Wuzhou and Jixi.

Flight carbon footprint between Wuzhou Changzhoudao Airport (WUZ) and Jixi Xingkaihu Airport (JXA)

On average, flying from Wuzhou to Jixi generates about 206 kg of CO2 per passenger (about 455 pounds). These figures are estimates and include only the CO2 generated by burning jet fuel.
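Per-passenger CO2 estimates of this kind are usually derived from a distance-based emissions factor. The site's methodology is not published; the sketch below assumes a factor of about 0.068 kg CO2 per passenger-kilometre, chosen only because it reproduces the quoted figure for this route:

```python
# Assumed per-passenger emissions factor (kg CO2 per km); NOT the site's published value.
EMISSIONS_KG_PER_KM = 0.068

def co2_estimate_kg(distance_km, factor=EMISSIONS_KG_PER_KM):
    """Rough per-passenger CO2 estimate from flight distance."""
    return distance_km * factor

kg = co2_estimate_kg(3021)       # ≈ 205 kg for the Wuzhou–Jixi distance
lb = kg * 2.20462                # kilograms to pounds
print(round(kg), round(lb))
```

Real calculators also account for aircraft type, load factor, and the extra fuel burned during takeoff, so this linear model is only a first approximation.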

Map of flight path and driving directions from Wuzhou to Jixi

See the map of the shortest flight path between Wuzhou Changzhoudao Airport (WUZ) and Jixi Xingkaihu Airport (JXA).

Airport information

Origin Wuzhou Changzhoudao Airport
City: Wuzhou
Country: China
IATA Code: WUZ
ICAO Code: ZGWZ
Coordinates: 23°27′24″N, 111°14′52″E
Destination Jixi Xingkaihu Airport
City: Jixi
Country: China
IATA Code: JXA
ICAO Code: ZYJX
Coordinates: 45°17′34″N, 131°11′34″E
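The coordinates above are given in degrees, minutes, and seconds; distance formulas need them in decimal degrees. A minimal conversion helper (the function name is illustrative):

```python
def dms_to_decimal(deg, minutes, seconds, hemi):
    """Convert degrees/minutes/seconds plus hemisphere letter to decimal degrees."""
    sign = -1 if hemi in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

# Wuzhou Changzhoudao Airport: 23°27′24″N, 111°14′52″E
print(round(dms_to_decimal(23, 27, 24, "N"), 6))   # 23.456667
print(round(dms_to_decimal(111, 14, 52, "E"), 6))  # 111.247778
```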