
How far is Aqaba from Al-Baha?

The distance between Al-Baha (Al-Baha Domestic Airport) and Aqaba (King Hussein International Airport) is 763 miles / 1229 kilometers / 663 nautical miles.

The driving distance from Al-Baha (ABT) to Aqaba (AQJ) is 914 miles / 1471 kilometers, and travel time by car is about 17 hours 44 minutes.

Al-Baha Domestic Airport – King Hussein International Airport
  • 763 miles
  • 1229 kilometers
  • 663 nautical miles


Distance from Al-Baha to Aqaba

There are several ways to calculate the distance from Al-Baha to Aqaba. Here are two standard methods:

Vincenty's formula (applied above)
  • 763.444 miles
  • 1228.644 kilometers
  • 663.415 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
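The calculator's own code isn't published, but the sketch below is an illustrative Python implementation of the standard Vincenty inverse formula on the WGS-84 ellipsoid, using the decimal-degree equivalents of the airport coordinates listed in the airport information section further down.

```python
# Illustrative sketch of Vincenty's inverse formula on the WGS-84 ellipsoid.
# This is not the calculator's own code; coordinates are in decimal degrees.
from math import radians, sin, cos, tan, atan, atan2, sqrt

def vincenty_km(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    a = 6378137.0              # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a            # semi-minor axis (m)

    U1 = atan((1 - f) * tan(radians(lat1)))   # reduced latitudes
    U2 = atan((1 - f) * tan(radians(lat2)))
    L = radians(lon2 - lon1)
    lam = L
    sinU1, cosU1 = sin(U1), cos(U1)
    sinU2, cosU2 = sin(U2), cos(U2)

    for _ in range(max_iter):                 # may fail to converge for
        sin_lam, cos_lam = sin(lam), cos(lam) # nearly antipodal points
        sin_sigma = sqrt((cosU2 * sin_lam) ** 2 +
                         (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        if sin_sigma == 0:
            return 0.0                        # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = ((cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha)
                        if cos2_alpha else 0.0)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
            B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) *
            (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0  # metres -> kilometres

# ABT (20°17′45″N, 41°38′3″E) to AQJ (29°36′41″N, 35°1′5″E)
print(round(vincenty_km(20.2958, 41.6342, 29.6114, 35.0181), 1))  # ≈ 1228.6 km
```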

Haversine formula
  • 765.109 miles
  • 1231.323 kilometers
  • 664.861 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
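For comparison, here is a minimal haversine sketch in Python, assuming a mean Earth radius of 6371 km and the same decimal-degree coordinates as above; the spherical model overestimates this particular route by a couple of kilometers relative to Vincenty.

```python
# Minimal sketch of the haversine (great-circle) distance on a spherical
# Earth with an assumed mean radius of 6371 km.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two latitude/longitude points."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * radius_km * asin(sqrt(a))

# ABT (20°17′45″N, 41°38′3″E) to AQJ (29°36′41″N, 35°1′5″E)
d = haversine_km(20.2958, 41.6342, 29.6114, 35.0181)
print(f"{d:.1f} km")  # ≈ 1231 km, i.e. about 765 miles
```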

How long does it take to fly from Al-Baha to Aqaba?

The estimated flight time from Al-Baha Domestic Airport to King Hussein International Airport is 1 hour and 56 minutes.
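As a rough back-of-envelope check, a typical short-haul estimate adds a fixed allowance for taxi, climb and descent to the cruise time over the great-circle distance. The cruise speed and allowance in the sketch below are assumed values, not the calculator's actual parameters, so it only lands in the same ballpark as the figure above.

```python
# Rough flight-time estimate: cruise speed and the fixed taxi/climb/descent
# allowance are assumptions, not the calculator's published parameters.
def estimate_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    minutes = overhead_min + distance_miles / cruise_mph * 60
    return divmod(round(minutes), 60)   # (hours, minutes)

hours, mins = estimate_flight_time(763.4)
print(f"about {hours} h {mins} min")  # ≈ 2 h 2 min with these assumed values;
                                      # the page quotes 1 h 56 min
```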

What is the time difference between Al-Baha and Aqaba?

There is no time difference between Al-Baha and Aqaba.

Flight carbon footprint between Al-Baha Domestic Airport (ABT) and King Hussein International Airport (AQJ)

On average, flying from Al-Baha to Aqaba generates about 131 kg of CO2 per passenger, which is equivalent to roughly 289 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
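The sketch below shows the kind of per-passenger estimate quoted above. The emission factor of about 0.107 kg CO2 per passenger-kilometre is back-solved from this page's own figures (131 kg over roughly 1229 km) and is an assumption, not the calculator's published methodology.

```python
# Per-passenger CO2 sketch; the emission factor is an assumed value
# back-solved from the figures on this page, not an official methodology.
KG_PER_LB = 0.45359237

def co2_estimate_kg(distance_km, kg_per_pax_km=0.107):
    return distance_km * kg_per_pax_km

co2_kg = co2_estimate_kg(1228.6)
print(f"{co2_kg:.0f} kg CO2 per passenger")   # ≈ 131 kg
print(f"{co2_kg / KG_PER_LB:.0f} lbs")        # ≈ 290 lbs, in line with the
                                              # 289 lbs quoted above
```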

Map of flight path and driving directions from Al-Baha to Aqaba

See the map of the shortest flight path between Al-Baha Domestic Airport (ABT) and King Hussein International Airport (AQJ).

Airport information

Origin Al-Baha Domestic Airport
City: Al-Baha
Country: Saudi Arabia
IATA Code: ABT
ICAO Code: OEBA
Coordinates: 20°17′45″N, 41°38′3″E
Destination King Hussein International Airport
City: Aqaba
Country: Jordan
IATA Code: AQJ
ICAO Code: OJAQ
Coordinates: 29°36′41″N, 35°1′5″E
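The coordinates above are listed in degrees, minutes and seconds; the distance formulas earlier on this page work in decimal degrees. A small helper for the conversion, using the two airports listed here:

```python
# Convert degrees/minutes/seconds to the decimal degrees used by the
# distance formulas above (southern/western hemispheres become negative).
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

abt = (dms_to_decimal(20, 17, 45, "N"), dms_to_decimal(41, 38, 3, "E"))
aqj = (dms_to_decimal(29, 36, 41, "N"), dms_to_decimal(35, 1, 5, "E"))
print(abt)  # ≈ (20.2958, 41.6342)
print(aqj)  # ≈ (29.6114, 35.0181)
```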