
How far is Jambi from Phuket?

The distance between Phuket (Phuket International Airport) and Jambi (Sultan Thaha Syaifuddin Airport) is 764 miles / 1230 kilometers / 664 nautical miles.

Phuket International Airport – Sultan Thaha Syaifuddin Airport
  • 764 miles
  • 1230 kilometers
  • 664 nautical miles


Distance from Phuket to Jambi

There are several ways to calculate the distance from Phuket to Jambi. Here are two standard methods:

Vincenty's formula (applied above)
  • 764.164 miles
  • 1229.803 kilometers
  • 664.040 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
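For readers who want to reproduce the numbers, here is a minimal Python sketch of Vincenty's inverse method. The WGS-84 constants and the convergence tolerance are assumed, since the calculator does not state which ellipsoid or settings it uses:

    import math

    def vincenty_inverse(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
        """Geodesic distance in metres on the WGS-84 ellipsoid (assumed datum)."""
        a = 6378137.0           # WGS-84 semi-major axis (m)
        f = 1 / 298.257223563   # WGS-84 flattening
        b = (1 - f) * a         # semi-minor axis

        # Reduced latitudes and the longitude difference
        U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
        U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
        L = math.radians(lon2 - lon1)
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)

        lam = L
        for _ in range(max_iter):  # iterate the longitude on the auxiliary sphere
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.hypot(cosU2 * sin_lam,
                                   cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
            if sin_sigma == 0:
                return 0.0  # coincident points
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos2_alpha = 1 - sin_alpha ** 2
            cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
            C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
            lam_prev = lam
            lam = L + (1 - C) * f * sin_alpha * (
                sigma + C * sin_sigma * (
                    cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
            if abs(lam - lam_prev) < tol:
                break

        # Standard Vincenty series for the ellipsoidal correction
        u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sm ** 2)
            - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
              * (-3 + 4 * cos_2sm ** 2)))
        return b * A * (sigma - d_sigma)

    # HKT → DJB, using the decimal coordinates derived from the airport table below
    print(vincenty_inverse(8.113056, 98.316667, -1.637778, 103.643889) / 1000)
    # ≈ 1229.8 km, in line with the figure above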

Haversine formula
  • 767.232 miles
  • 1234.740 kilometers
  • 666.706 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
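The haversine computation is short enough to show in full. The mean Earth radius of 6371 km below is an assumed value, though it reproduces the kilometre figure quoted above:

    import math

    def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
        """Great-circle distance on a sphere of the given radius (km)."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        a = (math.sin(dphi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
        return 2 * radius_km * math.asin(math.sqrt(a))

    print(haversine_km(8.113056, 98.316667, -1.637778, 103.643889))
    # ≈ 1234.7 km, matching the haversine figure above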

How long does it take to fly from Phuket to Jambi?

The estimated flight time from Phuket International Airport to Sultan Thaha Syaifuddin Airport is 1 hour and 56 minutes.
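The calculator's timing model is not published. A common approach is distance divided by an assumed average speed plus a fixed allowance for taxi, climb, and descent; the parameters in the sketch below (500 mph, 24 minutes) are guesses chosen only because they land near the quoted figure:

    def estimated_flight_time(distance_miles, cruise_mph=500, overhead_min=24):
        """Rough block-time estimate: cruise time plus a fixed ground/climb allowance.
        Both parameters are assumptions, not the site's published constants."""
        total_min = distance_miles / cruise_mph * 60 + overhead_min
        hours, minutes = divmod(round(total_min), 60)
        return f"{hours} h {minutes} min"

    print(estimated_flight_time(764.164))  # ≈ 1 h 56 min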

What is the time difference between Phuket and Jambi?

There is no time difference between Phuket and Jambi: both are on UTC+7 (Indochina Time in Thailand, Western Indonesia Time in Jambi).

Flight carbon footprint between Phuket International Airport (HKT) and Sultan Thaha Syaifuddin Airport (DJB)

On average, flying from Phuket to Jambi generates about 131 kg of CO2 per passenger, which is roughly 289 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
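The emission model behind the 131 kg figure is not given. The snippet below shows only the pound conversion and the per-kilometre intensity that the quoted numbers imply:

    KG_PER_LB = 0.453592      # 1 lb = 0.453592 kg

    co2_kg = 131.0            # per-passenger estimate quoted above
    distance_km = 1229.803    # Vincenty distance from the section above

    print(co2_kg / KG_PER_LB)    # ≈ 288.8 lbs, i.e. roughly 289
    print(co2_kg / distance_km)  # ≈ 0.107 kg CO2 per passenger-km implied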

Map of flight path from Phuket to Jambi

See the map of the shortest flight path between Phuket International Airport (HKT) and Sultan Thaha Syaifuddin Airport (DJB).

Airport information

Origin: Phuket International Airport
City: Phuket
Country: Thailand
IATA Code: HKT
ICAO Code: VTSP
Coordinates: 8°6′47″N, 98°19′0″E
Destination: Sultan Thaha Syaifuddin Airport
City: Jambi
Country: Indonesia
IATA Code: DJB
ICAO Code: WIPA
Coordinates: 1°38′16″S, 103°38′38″E
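The coordinates above are given in degrees/minutes/seconds, while the distance formulas earlier expect signed decimal degrees. A small helper (the function name is hypothetical, not part of this site) performs the conversion:

    def dms_to_decimal(degrees, minutes, seconds, hemisphere):
        """Convert degrees/minutes/seconds plus a hemisphere letter
        (N/S/E/W) to signed decimal degrees."""
        sign = -1 if hemisphere in ("S", "W") else 1
        return sign * (degrees + minutes / 60 + seconds / 3600)

    # Airport coordinates from the table above
    hkt = (dms_to_decimal(8, 6, 47, "N"), dms_to_decimal(98, 19, 0, "E"))
    djb = (dms_to_decimal(1, 38, 16, "S"), dms_to_decimal(103, 38, 38, "E"))
    print(hkt)  # ≈ (8.113056, 98.316667)
    print(djb)  # ≈ (-1.637778, 103.643889)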