
How far is Bayda from Benghazi?

The distance between Benghazi (Benina International Airport) and Bayda (Al Abraq International Airport) is 110 miles / 177 kilometers / 96 nautical miles.

The driving distance from Benghazi (BEN) to Bayda (LAQ) is 5027 miles / 8090 kilometers, and travel time by car is about 123 hours 31 minutes.

Benina International Airport – Al Abraq International Airport

110 miles / 177 kilometers / 96 nautical miles


Distance from Benghazi to Bayda

There are several ways to calculate the distance from Benghazi to Bayda. Here are two standard methods:

Vincenty's formula (applied above)
  • 109.906 miles
  • 176.877 kilometers
  • 95.506 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
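As a rough cross-check, the ellipsoidal figure can be reproduced with the geographiclib package, which solves the same geodesic problem on the WGS-84 ellipsoid (it uses Karney's algorithm rather than Vincenty's iteration, so this is a sketch of an equivalent method, not the calculator's exact code). The decimal coordinates are converted from the airport information at the end of this page.

```python
# Minimal sketch: ellipsoidal (WGS-84) distance between BEN and LAQ.
# geographiclib solves the same inverse geodesic problem as Vincenty's formula.
from geographiclib.geodesic import Geodesic

# Decimal-degree coordinates converted from the airport information section below
BEN = (32.0967, 20.2694)   # Benina International Airport
LAQ = (32.7886, 21.9642)   # Al Abraq International Airport

result = Geodesic.WGS84.Inverse(BEN[0], BEN[1], LAQ[0], LAQ[1])
meters = result["s12"]                        # geodesic distance in metres
print(round(meters / 1609.344, 3), "miles")   # ~109.9 miles
print(round(meters / 1852, 3), "NM")          # ~95.5 nautical miles
```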

Haversine formula
  • 109.777 miles
  • 176.669 kilometers
  • 95.394 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
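For reference, here is a minimal haversine implementation, assuming a mean Earth radius of 6,371 km; plugging in the airport coordinates listed below reproduces the spherical figures above.

```python
from math import asin, cos, radians, sin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points on a spherical Earth."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * radius_km * asin(sqrt(a))

# Benina (BEN) to Al Abraq (LAQ), decimal degrees from the airport section below
km = haversine_km(32.0967, 20.2694, 32.7886, 21.9642)
print(round(km, 3), "km")                 # ~176.7 km
print(round(km / 1.609344, 3), "miles")   # ~109.8 miles
```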

How long does it take to fly from Benghazi to Bayda?

The estimated flight time from Benina International Airport to Al Abraq International Airport is 42 minutes.
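The page does not state how this estimate is derived. A common back-of-the-envelope approach (an assumption here, not the site's published method) adds a fixed allowance for taxi, climb and descent to the cruise segment, which lands close to the 42-minute figure:

```python
# Rough flight-time sketch; the 500 mph cruise speed and 30-minute
# taxi/climb/descent allowance are assumptions, not the site's method.
distance_miles = 109.906                     # Vincenty distance from above
cruise_speed_mph = 500
overhead_minutes = 30
estimate = overhead_minutes + distance_miles / cruise_speed_mph * 60
print(round(estimate), "minutes")            # ~43 minutes, close to the 42 quoted above
```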

What is the time difference between Benghazi and Bayda?

There is no time difference between Benghazi and Bayda.

Flight carbon footprint between Benina International Airport (BEN) and Al Abraq International Airport (LAQ)

On average, flying from Benghazi to Bayda generates about 41 kg of CO2 per passenger, roughly 90 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
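The site does not publish its emission methodology. A simple per-mile factor gets into the same range; the 0.37 kg of CO2 per passenger-mile used below is an assumed short-haul value, not the calculator's own figure.

```python
# Back-of-the-envelope CO2 sketch; the emission factor is an assumption.
distance_miles = 110
kg_per_passenger_mile = 0.37          # assumed short-haul jet-fuel factor
kg_co2 = distance_miles * kg_per_passenger_mile
print(round(kg_co2), "kg")            # ~41 kg
print(round(kg_co2 * 2.20462), "lb")  # ~90 lb
```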

Map of flight path and driving directions from Benghazi to Bayda

See the map of the shortest flight path between Benina International Airport (BEN) and Al Abraq International Airport (LAQ).

Airport information

Origin: Benina International Airport
City: Benghazi
Country: Libya
IATA Code: BEN
ICAO Code: HLLB
Coordinates: 32°5′48″N, 20°16′10″E
Destination: Al Abraq International Airport
City: Bayda
Country: Libya
IATA Code: LAQ
ICAO Code: HLLQ
Coordinates: 32°47′19″N, 21°57′51″E