
How far is Bayda from Tripoli?

The distance between Tripoli (Mitiga International Airport) and Bayda (Al Abraq International Airport) is 505 miles / 813 kilometers / 439 nautical miles.

The driving distance from Tripoli (MJI) to Bayda (LAQ) is 4350 miles / 7001 kilometers, and travel time by car is about 102 hours 35 minutes.

Mitiga International Airport – Al Abraq International Airport

505 miles / 813 kilometers / 439 nautical miles


Distance from Tripoli to Bayda

There are several ways to calculate the distance from Tripoli to Bayda. Here are two standard methods:

Vincenty's formula (applied above)
  • 505.332 miles
  • 813.253 kilometers
  • 439.122 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
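
For readers who want to reproduce the figure, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The function name vincenty_miles, the iteration cap, and the convergence tolerance are our own illustrative choices, not parameters published by this calculator.

    import math

    def vincenty_miles(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
        # WGS-84 ellipsoid parameters
        a = 6378137.0            # semi-major axis, meters
        f = 1 / 298.257223563    # flattening
        b = (1 - f) * a          # semi-minor axis, meters

        U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitude
        U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
        L = math.radians(lon2 - lon1)
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)

        lam = L
        for _ in range(max_iter):  # may not converge for near-antipodal points
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.hypot(cosU2 * sin_lam,
                                   cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
            if sin_sigma == 0.0:
                return 0.0  # coincident points
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos2_alpha = 1.0 - sin_alpha ** 2
            # cos(2*sigma_m); zero on equatorial lines where cos2_alpha == 0
            cos_2sig_m = (cos_sigma - 2.0 * sinU1 * sinU2 / cos2_alpha
                          if cos2_alpha else 0.0)
            C = f / 16.0 * cos2_alpha * (4.0 + f * (4.0 - 3.0 * cos2_alpha))
            lam_prev = lam
            lam = L + (1.0 - C) * f * sin_alpha * (
                sigma + C * sin_sigma * (
                    cos_2sig_m + C * cos_sigma * (-1.0 + 2.0 * cos_2sig_m ** 2)))
            if abs(lam - lam_prev) < tol:
                break

        u2 = cos2_alpha * (a * a - b * b) / (b * b)
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        delta_sigma = B * sin_sigma * (cos_2sig_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sig_m ** 2)
            - B / 6 * cos_2sig_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sig_m ** 2)))
        meters = b * A * (sigma - delta_sigma)
        return meters / 1609.344  # meters -> statute miles

    # MJI (32°53′38″N, 13°16′33″E) to LAQ (32°47′19″N, 21°57′51″E)
    print(vincenty_miles(32.8939, 13.2758, 32.7886, 21.9642))  # ≈ 505.3 miles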

Haversine formula
  • 504.271 miles
  • 811.545 kilometers
  • 438.199 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
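
For comparison, a short Python sketch of the haversine great-circle distance. The 3,958.8-mile mean Earth radius is a common convention, and the function name is ours.

    import math

    def haversine_miles(lat1, lon1, lat2, lon2):
        R = 3958.8  # mean Earth radius in statute miles (a common convention)
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        h = (math.sin(dphi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
        return 2 * R * math.asin(math.sqrt(h))  # great-circle arc length

    # MJI (32°53′38″N, 13°16′33″E) to LAQ (32°47′19″N, 21°57′51″E)
    print(haversine_miles(32.8939, 13.2758, 32.7886, 21.9642))  # ≈ 504 miles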

How long does it take to fly from Tripoli to Bayda?

The estimated flight time from Mitiga International Airport to Al Abraq International Airport is 1 hour and 27 minutes.
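
The page does not state how this estimate is derived. A common back-of-the-envelope model is distance divided by an average block speed plus a fixed allowance for taxi, climb, and descent; the 500 mph cruise speed and 30-minute overhead below are hypothetical round numbers of ours, not the calculator's actual parameters.

    def estimate_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
        # Rough block time: cruise segment plus a fixed taxi/climb/descent
        # allowance. Both parameters are illustrative assumptions, not the
        # site's model.
        total_min = distance_miles / cruise_mph * 60 + overhead_min
        hours, minutes = divmod(round(total_min), 60)
        return f"{hours} h {minutes} min"

    print(estimate_flight_time(505.332))  # "1 h 31 min" under these assumptions
                                          # (the page shows 1 h 27 min)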

What is the time difference between Tripoli and Bayda?

There is no time difference between Tripoli and Bayda; both cities are in Libya and share the same time zone.

Flight carbon footprint between Mitiga International Airport (MJI) and Al Abraq International Airport (LAQ)

On average, flying from Tripoli to Bayda generates about 99 kg of CO2 per passenger; 99 kilograms equals 219 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
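
The page does not disclose its emissions model. As a crude, distance-only sketch, one can multiply the flown distance by a per-passenger-kilometer emission factor; the 0.12 kg CO2 per passenger-kilometer below is a hypothetical round figure, not the calculator's factor.

    def co2_per_passenger_kg(distance_km, kg_per_pax_km=0.12):
        # Crude distance-only estimate; the emission factor is a hypothetical
        # round number, not the calculator's actual model.
        return distance_km * kg_per_pax_km

    kg = co2_per_passenger_kg(813.253)
    print(f"{kg:.0f} kg = {kg * 2.20462:.0f} lb")  # "98 kg = 215 lb" with this factor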

Map of flight path and driving directions from Tripoli to Bayda

See the map of the shortest flight path between Mitiga International Airport (MJI) and Al Abraq International Airport (LAQ).

Airport information

Origin: Mitiga International Airport
City: Tripoli
Country: Libya
IATA Code: MJI
ICAO Code: HLLM
Coordinates: 32°53′38″N, 13°16′33″E
Destination: Al Abraq International Airport
City: Bayda
Country: Libya
IATA Code: LAQ
ICAO Code: HLLQ
Coordinates: 32°47′19″N, 21°57′51″E
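
The coordinates above are given in degrees, minutes, and seconds, while the distance formulas earlier expect decimal degrees. A small conversion helper (the function name is ours):

    def dms_to_decimal(degrees, minutes, seconds, hemisphere):
        # Convert degrees/minutes/seconds to signed decimal degrees.
        sign = -1 if hemisphere in ("S", "W") else 1
        return sign * (degrees + minutes / 60 + seconds / 3600)

    # MJI: 32°53′38″N, 13°16′33″E
    print(dms_to_decimal(32, 53, 38, "N"), dms_to_decimal(13, 16, 33, "E"))  # 32.8939 13.2758
    # LAQ: 32°47′19″N, 21°57′51″E
    print(dms_to_decimal(32, 47, 19, "N"), dms_to_decimal(21, 57, 51, "E"))  # 32.7886 21.9642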