
How far is Kithira from Bayda?

The distance between Bayda (Al Abraq International Airport) and Kithira (Kithira Island National Airport) is 248 miles / 399 kilometers / 215 nautical miles.

Al Abraq International Airport – Kithira Island National Airport
  • 248 miles
  • 399 kilometers
  • 215 nautical miles


Distance from Bayda to Kithira

There are several ways to calculate the distance from Bayda to Kithira. Here are two standard methods:

Vincenty's formula (applied above)
  • 247.648 miles
  • 398.551 kilometers
  • 215.200 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
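The page doesn't show the calculation itself; as a sketch, here is a minimal Python implementation of the standard Vincenty inverse iteration on the WGS-84 ellipsoid, using decimal-degree coordinates converted from the DMS values in the airport information section:

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2):
    """Geodesic distance in km on the WGS-84 ellipsoid (Vincenty inverse)."""
    a = 6378137.0               # semi-major axis, meters
    f = 1 / 298.257223563       # flattening
    b = (1 - f) * a             # semi-minor axis

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate until the longitude difference converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
              * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0

# LAQ (32°47′19″N, 21°57′51″E) -> KIT (36°16′27″N, 23°1′1″E)
d = vincenty_km(32.78861, 21.96417, 36.27417, 23.01694)  # ≈ 398.55 km
```

The iteration converges quickly for airport-to-airport distances like this one; the algorithm only struggles for nearly antipodal points.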

Haversine formula
  • 248.169 miles
  • 399.389 kilometers
  • 215.653 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
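The haversine calculation is compact enough to show in full. A minimal Python version, using a mean Earth radius of 6,371 km and the same decimal-degree coordinates as above:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, r=6371.0):
    """Great-circle distance in km on a sphere of radius r."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# LAQ -> KIT
d = haversine_km(32.78861, 21.96417, 36.27417, 23.01694)  # ≈ 399.39 km
```

The result matches the 399.389 km figure quoted above; the small difference from Vincenty's value reflects the spherical versus ellipsoidal Earth models.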

How long does it take to fly from Bayda to Kithira?

The estimated flight time from Al Abraq International Airport to Kithira Island National Airport is 58 minutes.
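The exact assumptions behind the 58-minute figure aren't published on the page. A common rule of thumb, sketched below with illustrative values (not the site's), adds a fixed allowance for taxi, climb, and descent to cruise time at a typical jet speed:

```python
def estimate_flight_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough flight-time estimate: fixed overhead plus time at cruise speed.
    cruise_mph and overhead_min are illustrative assumptions."""
    return overhead_min + distance_miles / cruise_mph * 60

est = estimate_flight_minutes(248)  # ≈ 60 minutes, close to the quoted 58
```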

What is the time difference between Bayda and Kithira?

There is no time difference between Bayda and Kithira.

Flight carbon footprint between Al Abraq International Airport (LAQ) and Kithira Island National Airport (KIT)

On average, flying from Bayda to Kithira generates about 61 kg of CO2 per passenger; 61 kilograms is about 134 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
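The unit conversion behind that figure follows from the exact definition of the pound (1 lb = 0.45359237 kg):

```python
KG_PER_LB = 0.45359237  # exact, by definition

def kg_to_lb(kg):
    """Convert kilograms to pounds."""
    return kg / KG_PER_LB

co2_lb = round(kg_to_lb(61))  # 134
```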

Map of flight path from Bayda to Kithira

See the map of the shortest flight path between Al Abraq International Airport (LAQ) and Kithira Island National Airport (KIT).

Airport information

Origin Al Abraq International Airport
City: Bayda
Country: Libya
IATA Code: LAQ
ICAO Code: HLLQ
Coordinates: 32°47′19″N, 21°57′51″E
Destination Kithira Island National Airport
City: Kithira
Country: Greece
IATA Code: KIT
ICAO Code: LGKC
Coordinates: 36°16′27″N, 23°1′1″E
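The coordinates above are given in degrees, minutes, and seconds; the distance formulas need decimal degrees. A small parser for the format used on this page (`dms_to_decimal` is a hypothetical helper name):

```python
import re

def dms_to_decimal(dms):
    """Parse a coordinate like 32°47′19″N into signed decimal degrees."""
    m = re.match(r"(\d+)°(\d+)′(\d+)″([NSEW])", dms)
    deg, minutes, seconds, hemi = m.groups()
    value = int(deg) + int(minutes) / 60 + int(seconds) / 3600
    return -value if hemi in "SW" else value  # S and W are negative

lat_laq = dms_to_decimal("32°47′19″N")  # ≈ 32.7886
lon_kit = dms_to_decimal("23°1′1″E")    # ≈ 23.0169
```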