
How far is Benghazi from Bayda?

The distance between Bayda (Al Abraq International Airport) and Benghazi (Benina International Airport) is 110 miles / 177 kilometers / 96 nautical miles.

The driving distance from Bayda (LAQ) to Benghazi (BEN) is 5026 miles / 8089 kilometers, and travel time by car is about 123 hours 19 minutes.

Al Abraq International Airport – Benina International Airport



Distance from Bayda to Benghazi

There are several ways to calculate the distance from Bayda to Benghazi. Here are two standard methods:

Vincenty's formula (applied above)
  • 109.906 miles
  • 176.877 kilometers
  • 95.506 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
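The Vincenty result above can be reproduced with a short Python sketch of the standard Vincenty inverse iteration on the WGS-84 ellipsoid (a = 6378137 m, f = 1/298.257223563), using the airport coordinates listed further down the page. This is a generic implementation of the published algorithm, not the site's own code:

```python
import math

def vincenty_inverse(lat1, lon1, lat2, lon2):
    """Geodesic distance in metres on the WGS-84 ellipsoid
    (Vincenty's inverse formula, iterated to convergence)."""
    a = 6378137.0                     # semi-major axis (m)
    f = 1 / 298.257223563             # flattening
    b = (1 - f) * a                   # semi-minor axis (m)
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)
    lam = L
    for _ in range(200):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0                # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                   if cos2_alpha else 0.0)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma *
                                     (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break
    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2) -
        B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) *
        (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma)

# LAQ (32°47′19″N, 21°57′51″E) to BEN (32°5′48″N, 20°16′10″E)
d_m = vincenty_inverse(32.788611, 21.964167, 32.096667, 20.269444)
print(round(d_m / 1000, 3), "km")
```

Run against these coordinates, the result agrees with the ~176.877 km figure quoted above to within rounding.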

Haversine formula
  • 109.777 miles
  • 176.669 kilometers
  • 95.394 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
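The haversine calculation is much simpler, since it assumes a sphere. A minimal Python sketch, assuming a mean Earth radius of 6371 km and the airport coordinates listed further down the page:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in kilometres on a sphere
    of the given mean radius (haversine formula)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2 +
         math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return radius_km * 2 * math.asin(math.sqrt(a))

# LAQ (32°47′19″N, 21°57′51″E) to BEN (32°5′48″N, 20°16′10″E)
km = haversine_km(32.788611, 21.964167, 32.096667, 20.269444)
print(round(km, 1), "km")
```

This yields roughly 176.7 km, matching the haversine figure above; the small gap from the Vincenty result reflects the spherical-Earth assumption.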

How long does it take to fly from Bayda to Benghazi?

The estimated flight time from Al Abraq International Airport to Benina International Airport is 42 minutes.
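The site does not publish its flight-time model, but estimates like this are commonly a fixed overhead for taxi, climb, and descent plus time at an assumed average cruise speed. A hedged sketch using assumed values of 30 minutes overhead and a 500 mph cruise, which lands close to the quoted 42 minutes:

```python
def estimate_flight_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    """Crude block-time estimate: fixed taxi/climb/descent overhead
    plus time spent at an assumed average cruise speed.
    Both parameters are illustrative assumptions, not the site's model."""
    return overhead_min + distance_miles / cruise_mph * 60

print(round(estimate_flight_minutes(110)), "minutes")  # ~43 under these assumptions
```

For a 110-mile hop the overhead dominates, which is why such short flights take far longer per mile than long-haul routes.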

What is the time difference between Bayda and Benghazi?

There is no time difference between Bayda and Benghazi.

Flight carbon footprint between Al Abraq International Airport (LAQ) and Benina International Airport (BEN)

On average, flying from Bayda to Benghazi generates about 41 kg (roughly 90 pounds) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.

Map of flight path and driving directions from Bayda to Benghazi

See the map of the shortest flight path between Al Abraq International Airport (LAQ) and Benina International Airport (BEN).

Airport information

Origin: Al Abraq International Airport
City: Bayda
Country: Libya
IATA Code: LAQ
ICAO Code: HLLQ
Coordinates: 32°47′19″N, 21°57′51″E

Destination: Benina International Airport
City: Benghazi
Country: Libya
IATA Code: BEN
ICAO Code: HLLB
Coordinates: 32°5′48″N, 20°16′10″E