
How far is Zhuhai from Phuket?

The distance between Phuket (Phuket International Airport) and Zhuhai (Zhuhai Jinwan Airport) is 1385 miles / 2229 kilometers / 1204 nautical miles.

The driving distance from Phuket (HKT) to Zhuhai (ZUH) is 1926 miles / 3099 kilometers, and travel time by car is about 37 hours 44 minutes.

Phuket International Airport – Zhuhai Jinwan Airport

1385 miles / 2229 kilometers / 1204 nautical miles


Distance from Phuket to Zhuhai

There are several ways to calculate the distance from Phuket to Zhuhai. Here are two standard methods:

Vincenty's formula (applied above)
  • 1385.037 miles
  • 2229.001 kilometers
  • 1203.564 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
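As a sketch of how such a figure can be reproduced, here is a minimal pure-Python implementation of Vincenty's inverse formula on the WGS-84 ellipsoid, using the airport coordinates listed below. (The calculator's exact implementation and ellipsoid parameters are not stated on the page, so small differences in the last decimals are expected.)

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns distance in km."""
    a, f = 6378137.0, 1 / 298.257223563   # WGS-84 semi-major axis (m) and flattening
    b = a * (1 - f)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    lam = L
    for _ in range(200):  # iterate until the auxiliary longitude converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(
            math.cos(U2) * sin_lam,
            math.cos(U1) * math.sin(U2) - math.sin(U1) * math.cos(U2) * cos_lam,
        )
        cos_sigma = math.sin(U1) * math.sin(U2) + math.cos(U1) * math.cos(U2) * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = math.cos(U1) * math.cos(U2) * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * math.sin(U1) * math.sin(U2) / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2))
        )
        if abs(lam - lam_prev) < 1e-12:
            break
    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (
        cos_2sm + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sm ** 2)
            - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)
        )
    )
    return b * A * (sigma - d_sigma) / 1000  # metres -> kilometres

# HKT (8°6′47″N, 98°19′0″E) to ZUH (22°0′23″N, 113°22′33″E)
hkt = (8 + 6 / 60 + 47 / 3600, 98 + 19 / 60)
zuh = (22 + 0 / 60 + 23 / 3600, 113 + 22 / 60 + 33 / 3600)
print(f"{vincenty_km(*hkt, *zuh):.1f} km")
```

Run as written, this should land within about a kilometer of the 2229.001 km quoted above.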

Haversine formula
  • 1387.281 miles
  • 2232.613 kilometers
  • 1205.515 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth, i.e. the great-circle distance, which is the shortest path between two points along the surface of the sphere.
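The haversine computation is compact enough to sketch in a few lines. This version uses a mean earth radius of 6371 km; the calculator's exact radius is not stated, so the last decimal may differ slightly from the figures above.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, r=6371.0):
    """Great-circle distance (km) on a sphere of mean earth radius r."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# HKT (8°6′47″N, 98°19′0″E) to ZUH (22°0′23″N, 113°22′33″E)
hkt = (8 + 6 / 60 + 47 / 3600, 98 + 19 / 60)
zuh = (22 + 0 / 60 + 23 / 3600, 113 + 22 / 60 + 33 / 3600)
km = haversine_km(*hkt, *zuh)
print(f"{km:.1f} km / {km / 1.609344:.1f} mi / {km / 1.852:.1f} nm")
```

The output should agree with the haversine figures above (2232.6 km / 1387.3 mi / 1205.5 nm) to well under a kilometer.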

How long does it take to fly from Phuket to Zhuhai?

The estimated flight time from Phuket International Airport to Zhuhai Jinwan Airport is 3 hours and 7 minutes.
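The page does not state its timing model, but a common rule of thumb, which is only an assumption here, adds roughly 30 minutes of taxi, climb, and descent overhead to the great-circle distance flown at a typical jet cruise speed of about 500 mph. That lands in the same ballpark as the quoted estimate:

```python
# Hypothetical rule of thumb (not necessarily the site's actual model):
# cruise at ~500 mph plus ~30 minutes of taxi/climb/descent overhead.
distance_mi = 1385
cruise_mph = 500
overhead_h = 0.5
hours = distance_mi / cruise_mph + overhead_h
h, m = int(hours), round(hours % 1 * 60)
print(f"about {h} h {m} min")  # close to the 3 h 07 min quoted above
```

Different assumed cruise speeds and overheads shift the answer by a few minutes either way.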

Flight carbon footprint between Phuket International Airport (HKT) and Zhuhai Jinwan Airport (ZUH)

On average, flying from Phuket to Zhuhai generates about 172 kg of CO2 per passenger, roughly 380 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
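The kilogram-to-pound conversion can be checked directly with the exact definition of the pound:

```python
KG_PER_LB = 0.45359237  # exact definition of the avoirdupois pound
co2_kg = 172
co2_lb = co2_kg / KG_PER_LB
print(f"{co2_lb:.0f} lbs")  # ~379 lbs, which the page rounds up to 380
```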

Map of flight path and driving directions from Phuket to Zhuhai

See the map of the shortest flight path between Phuket International Airport (HKT) and Zhuhai Jinwan Airport (ZUH).

Airport information

Origin: Phuket International Airport
City: Phuket
Country: Thailand
IATA Code: HKT
ICAO Code: VTSP
Coordinates: 8°6′47″N, 98°19′0″E
Destination: Zhuhai Jinwan Airport
City: Zhuhai
Country: China
IATA Code: ZUH
ICAO Code: ZGSD
Coordinates: 22°0′23″N, 113°22′33″E