
How far is Al-Baha from Al-Ubayyid?

The distance between Al-Ubayyid (El Obeid Airport) and Al-Baha (Al-Baha Domestic Airport) is 901 miles / 1449 kilometers / 783 nautical miles.

El Obeid Airport – Al-Baha Domestic Airport


Distance from Al-Ubayyid to Al-Baha

There are several ways to calculate the distance from Al-Ubayyid to Al-Baha. Here are two standard methods:

Vincenty's formula (applied above)
  • 900.532 miles
  • 1449.266 kilometers
  • 782.541 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
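
For readers who want to reproduce the figure above, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The coordinates are taken from the airport information section below; the helper names, the iteration limit, and the convergence tolerance are illustrative choices rather than the calculator's actual code, so the last decimal places may differ slightly.

    import math

    def dms_to_deg(d, m, s):
        # Convert degrees/minutes/seconds to decimal degrees.
        return d + m / 60 + s / 3600

    def vincenty_distance_m(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
        # Inverse Vincenty formula on the WGS-84 ellipsoid; returns metres.
        # Note: this sketch does not handle the equatorial special case
        # (cos^2 alpha = 0) or nearly antipodal points.
        a = 6378137.0             # semi-major axis (m)
        f = 1 / 298.257223563     # flattening
        b = (1 - f) * a           # semi-minor axis (m)

        U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
        U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
        L = math.radians(lon2 - lon1)
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)

        lam = L
        for _ in range(max_iter):
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.hypot(cosU2 * sin_lam,
                                   cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
            if sin_sigma == 0:
                return 0.0        # coincident points
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos2_alpha = 1 - sin_alpha ** 2
            cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
            C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
            lam_prev = lam
            lam = L + (1 - C) * f * sin_alpha * (
                sigma + C * sin_sigma * (cos_2sigma_m + C * cos_sigma *
                                         (-1 + 2 * cos_2sigma_m ** 2)))
            if abs(lam - lam_prev) < tol:
                break

        u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
            B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) *
            (-3 + 4 * cos_2sigma_m ** 2)))
        return b * A * (sigma - delta_sigma)

    # EBD and ABT coordinates from the airport information section below.
    ebd = (dms_to_deg(13, 9, 11), dms_to_deg(30, 13, 57))
    abt = (dms_to_deg(20, 17, 45), dms_to_deg(41, 38, 3))
    metres = vincenty_distance_m(*ebd, *abt)
    print(f"{metres / 1609.344:.3f} mi, {metres / 1000:.3f} km, {metres / 1852:.3f} NM")

The result should land very close to the 900.5 miles / 1,449.3 kilometers quoted above.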

Haversine formula
  • 900.929 miles
  • 1449.904 kilometers
  • 782.886 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
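
The haversine figure can likewise be checked with a few lines of Python. The 6,371 km mean Earth radius below is a common convention; the calculator's exact radius isn't stated, so the trailing decimals may not match exactly.

    import math

    def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
        # Great-circle distance on a sphere of the given mean radius.
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlon = math.radians(lon2 - lon1)
        h = (math.sin(dphi / 2) ** 2 +
             math.cos(phi1) * math.cos(phi2) * math.sin(dlon / 2) ** 2)
        return 2 * radius_km * math.asin(math.sqrt(h))

    # EBD -> ABT in decimal degrees (converted from the DMS coordinates below).
    km = haversine_km(13.1531, 30.2325, 20.2958, 41.6342)
    print(f"{km:.3f} km = {km / 1.609344:.3f} mi = {km / 1.852:.3f} NM")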

How long does it take to fly from Al-Ubayyid to Al-Baha?

The estimated flight time from El Obeid Airport to Al-Baha Domestic Airport is 2 hours and 12 minutes.
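
The page does not publish its duration formula, but estimates like this are typically a fixed allowance for taxi, climb and descent plus cruise time at an average airliner speed. A rough sketch, with both constants chosen as assumptions rather than taken from the calculator:

    def estimated_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
        # Assumed values: ~500 mph average cruise speed and ~30 min
        # combined allowance for taxi, climb and descent.
        total_min = overhead_min + distance_miles / cruise_mph * 60
        return divmod(round(total_min), 60)

    hours, minutes = estimated_flight_time(901)
    print(f"{hours} h {minutes} min")  # ballpark only; the page's own estimate is 2 h 12 min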

Flight carbon footprint between El Obeid Airport (EBD) and Al-Baha Domestic Airport (ABT)

On average, flying from Al-Ubayyid to Al-Baha generates about 144 kg of CO2 per passenger, which is equivalent to 317 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
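
The methodology behind the 144 kg figure isn't given, but it is consistent with a flat per-passenger emission factor applied to the flight distance. In the sketch below, the 0.16 kg CO2 per passenger-mile factor is an assumption chosen to reproduce the quoted number, not a published constant; only the kg-to-lb conversion is a fixed fact.

    KG_PER_LB = 0.453592                 # kilograms per pound
    CO2_KG_PER_PASSENGER_MILE = 0.16     # assumed flat emission factor

    distance_miles = 901
    co2_kg = distance_miles * CO2_KG_PER_PASSENGER_MILE
    co2_lb = co2_kg / KG_PER_LB
    print(f"{co2_kg:.0f} kg CO2 per passenger (~{co2_lb:.0f} lb)")
    # ~144 kg / ~318 lb, close to the 144 kg / 317 lb quoted above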

Map of flight path from Al-Ubayyid to Al-Baha

See the map of the shortest flight path between El Obeid Airport (EBD) and Al-Baha Domestic Airport (ABT).

Airport information

Origin El Obeid Airport
City: Al-Ubayyid
Country: Sudan
IATA Code: EBD
ICAO Code: HSOB
Coordinates: 13°9′11″N, 30°13′57″E
Destination Al-Baha Domestic Airport
City: Al-Baha
Country: Saudi Arabia
IATA Code: ABT
ICAO Code: OEBA
Coordinates: 20°17′45″N, 41°38′3″E