
How far is Tripoli from Benghazi?

The distance between Benghazi (Benina International Airport) and Tripoli (Mitiga International Airport) is 412 miles / 663 kilometers / 358 nautical miles. The estimated flight time is 1 hour 16 minutes.

The driving distance from Benghazi (BEN) to Tripoli (MJI) is 684 miles / 1101 kilometers, and the travel time by car is about 21 hours 3 minutes.

Benina International Airport – Mitiga International Airport

412 miles / 663 kilometers / 358 nautical miles

Distance from Benghazi to Tripoli

There are several ways to calculate distance from Benghazi to Tripoli. Here are two common methods:

Vincenty's formula (applied above)
  • 412.001 miles
  • 663.052 kilometers
  • 358.019 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth’s surface, using an ellipsoidal model of the earth.
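For illustration, here is a sketch of Vincenty's inverse method in Python on the WGS-84 ellipsoid. The function name, iteration limit, and convergence tolerance are illustrative choices, not the calculator's actual implementation:

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula: geodesic distance in km on the WGS-84 ellipsoid."""
    a = 6378137.0                 # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563         # WGS-84 flattening
    b = (1 - f) * a               # semi-minor axis (m)
    L = math.radians(lon2 - lon1)
    # Reduced latitudes on the auxiliary sphere
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)
    lam = L
    for _ in range(200):          # iterate until lambda converges
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sinLam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        if sin_sigma == 0:
            return 0.0            # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sinLam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m + C * cos_sigma *
                                     (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break
    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
        B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) *
        (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000  # meters -> kilometers

# BEN (32°5′48″N, 20°16′10″E) to MJI (32°53′38″N, 13°16′33″E), decimal degrees
print(round(vincenty_km(32.0967, 20.2694, 32.8939, 13.2759), 1))
```

Run on the two airports' coordinates, this reproduces the roughly 663 km figure quoted above.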

Haversine formula
  • 411.178 miles
  • 661.727 kilometers
  • 357.304 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
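The haversine formula can be sketched in a few lines of Python. The function name is illustrative, and the mean Earth radius of 6371 km is the conventional value for a spherical model:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points on a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)   # difference in latitude
    dlam = math.radians(lon2 - lon1)   # difference in longitude
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# BEN (32°5′48″N, 20°16′10″E) to MJI (32°53′38″N, 13°16′33″E), decimal degrees
print(round(haversine_km(32.0967, 20.2694, 32.8939, 13.2759), 1))  # ≈ 661.7
```

The result agrees with the roughly 661.7 km haversine figure listed above; the small gap from the Vincenty value reflects the spherical versus ellipsoidal Earth models.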

How long does it take to fly from Benghazi to Tripoli?

The estimated flight time from Benina International Airport to Mitiga International Airport is 1 hour 16 minutes.

What is the time difference between Benghazi and Tripoli?

There is no time difference between Benghazi and Tripoli.

Flight carbon footprint between Benina International Airport (BEN) and Mitiga International Airport (MJI)

On average, flying from Benghazi to Tripoli generates about 86 kg of CO2 per passenger; 86 kilograms equals 189 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.

Map of flight path and driving directions from Benghazi to Tripoli

Shortest flight path between Benina International Airport (BEN) and Mitiga International Airport (MJI).

Airport information

Origin Benina International Airport
City: Benghazi
Country: Libya
IATA Code: BEN
ICAO Code: HLLB
Coordinates: 32°5′48″N, 20°16′10″E
Destination Mitiga International Airport
City: Tripoli
Country: Libya
IATA Code: MJI
ICAO Code: HLLM
Coordinates: 32°53′38″N, 13°16′33″E
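The coordinates above are in degrees/minutes/seconds; distance formulas typically take decimal degrees. A small conversion helper (the function name is an illustrative choice):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# Benina International Airport (BEN): 32°5′48″N, 20°16′10″E
ben = (dms_to_decimal(32, 5, 48, "N"), dms_to_decimal(20, 16, 10, "E"))
# Mitiga International Airport (MJI): 32°53′38″N, 13°16′33″E
mji = (dms_to_decimal(32, 53, 38, "N"), dms_to_decimal(13, 16, 33, "E"))
print(ben, mji)  # ≈ (32.0967, 20.2694) and (32.8939, 13.2759)
```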