How far is Sirte from Kraków?

The distance between Kraków (Kraków John Paul II International Airport) and Sirte (Ghardabiya Airbase) is 1322 miles / 2128 kilometers / 1149 nautical miles.

Distance from Kraków to Sirte

There are several ways to calculate the distance from Kraków to Sirte. Here are two standard methods:

Vincenty's formula (applied above)
  • 1322.385 miles
  • 2128.173 kilometers
  • 1149.121 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
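
The page does not publish its code, but Vincenty's inverse method is well documented. Below is a minimal Python sketch on the WGS-84 ellipsoid; the function name, convergence tolerance, and iteration cap are my own choices, not the site's implementation.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Inverse Vincenty formula on the WGS-84 ellipsoid; returns metres."""
    a = 6378137.0              # semi-major axis (m)
    f = 1 / 298.257223563      # flattening
    b = (1 - f) * a            # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):       # iterate until the longitude difference converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m
                + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
        - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
                               * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# KRK (50°4′39″N, 19°47′5″E) to SRX (31°3′48″N, 16°35′41″E)
metres = vincenty_distance(50.0775, 19.784722, 31.063333, 16.594722)
print(f"{metres / 1609.344:.3f} miles")   # ≈ 1322.4 miles, matching the figure above
```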

Haversine formula
  • 1324.066 miles
  • 2130.878 kilometers
  • 1150.582 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
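
For comparison, here is a minimal haversine sketch, assuming the conventional mean Earth radius of 6371 km:

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given radius, in km."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

km = haversine_distance(50.0775, 19.784722, 31.063333, 16.594722)
print(f"{km:.3f} km")  # ≈ 2130.9 km, matching the haversine figure above
```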

How long does it take to fly from Kraków to Sirte?

The estimated flight time from Kraków John Paul II International Airport to Ghardabiya Airbase is 3 hours and 0 minutes.
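
The page does not state how this estimate is derived. A common rule of thumb divides the distance by an average cruise speed and adds a fixed allowance for taxi, climb, and descent; the speed and allowance below are assumptions, not the calculator's published method.

```python
# Hypothetical rule of thumb; both constants are assumptions.
DISTANCE_MILES = 1322.385
CRUISE_MPH = 500          # assumed average cruise speed
OVERHEAD_HOURS = 0.5      # assumed taxi/climb/descent allowance

hours = DISTANCE_MILES / CRUISE_MPH + OVERHEAD_HOURS
h, m = int(hours), round((hours - int(hours)) * 60)
print(f"{h} h {m} min")   # ≈ 3 h 9 min, close to the 3 h 0 min quoted above
```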

What is the time difference between Kraków and Sirte?

Kraków observes Central European Time (UTC+1) in winter and Central European Summer Time (UTC+2) in summer, while Sirte stays on Eastern European Time (UTC+2) all year. The two cities therefore share the same clock time only while Poland is on summer time; in winter, Sirte is one hour ahead of Kraków.

Flight carbon footprint between Kraków John Paul II International Airport (KRK) and Ghardabiya Airbase (SRX)

On average, flying from Kraków to Sirte generates about 168 kg (371 lb) of CO2 per passenger. These figures are estimates and include only the CO2 produced by burning jet fuel.
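
The site does not publish its emissions model. As a rough sketch, the per-passenger factor below is back-calculated from the figures quoted above and is an assumption, not the calculator's method:

```python
# Hypothetical per-passenger emission factor derived from the quoted
# figures (168 kg over 1322.385 miles); not the site's actual model.
KG_CO2_PER_PASSENGER_MILE = 168 / 1322.385   # ≈ 0.127 kg per mile

def co2_kg(distance_miles):
    return distance_miles * KG_CO2_PER_PASSENGER_MILE

kg = co2_kg(1322.385)
# 168 kg ≈ 370 lb when converted exactly; the 371 lb quoted above
# presumably comes from an unrounded kilogram figure.
print(f"{kg:.0f} kg ({kg * 2.20462:.0f} lb)")
```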

Map of flight path from Kraków to Sirte

See the map of the shortest flight path between Kraków John Paul II International Airport (KRK) and Ghardabiya Airbase (SRX).

Airport information

Origin: Kraków John Paul II International Airport
City: Kraków
Country: Poland
IATA Code: KRK
ICAO Code: EPKK
Coordinates: 50°4′39″N, 19°47′5″E
Destination: Ghardabiya Airbase
City: Sirte
Country: Libya
IATA Code: SRX
ICAO Code: HLGD
Coordinates: 31°3′48″N, 16°35′41″E
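
The airport coordinates are listed in degrees, minutes, and seconds, while the distance functions above take decimal degrees. A small conversion helper (the function name is my own) bridges the two:

```python
# Convert degrees/minutes/seconds plus a hemisphere letter to decimal degrees.
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

krk = (dms_to_decimal(50, 4, 39, "N"), dms_to_decimal(19, 47, 5, "E"))
srx = (dms_to_decimal(31, 3, 48, "N"), dms_to_decimal(16, 35, 41, "E"))
print(krk)  # (50.0775, 19.7847...)
print(srx)  # (31.0633..., 16.5947...)
```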