
How far is Pescara from Lublin?

The distance between Lublin (Lublin Airport) and Pescara (Abruzzo Airport) is 730 miles / 1174 kilometers / 634 nautical miles.

The driving distance from Lublin (LUZ) to Pescara (PSR) is 1189 miles / 1913 kilometers, and travel time by car is about 20 hours 3 minutes.


Distance from Lublin to Pescara

There are several ways to calculate the distance from Lublin to Pescara. Here are two standard methods:

Vincenty's formula (applied above)
  • 729.546 miles
  • 1174.091 kilometers
  • 633.958 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
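The ellipsoidal calculation can be sketched with a standard implementation of Vincenty's inverse formula on the WGS-84 ellipsoid (a minimal sketch, not the calculator's own code; airport coordinates are taken from the airport information below):

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula on the WGS-84 ellipsoid, result in kilometers."""
    a = 6378137.0               # WGS-84 semi-major axis (meters)
    f = 1 / 298.257223563       # WGS-84 flattening
    b = (1 - f) * a             # semi-minor axis
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)
    lam = L
    for _ in range(200):        # iterate until the longitude difference converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0          # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m + C * cos_sigma *
                                     (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break
    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
        B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) *
        (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0

# LUZ (51°14′25″N, 22°42′48″E) to PSR (42°25′54″N, 14°10′51″E)
print(vincenty_km(51.240278, 22.713333, 42.431667, 14.180833))  # ≈ 1174 km
```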

Haversine formula
  • 729.031 miles
  • 1173.262 kilometers
  • 633.511 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
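The haversine distance is much simpler to compute; a minimal sketch using a mean earth radius of 6371 km and the airport coordinates listed below:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance on a spherical earth, result in kilometers."""
    R = 6371.0  # mean earth radius in kilometers
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

# LUZ (51°14′25″N, 22°42′48″E) to PSR (42°25′54″N, 14°10′51″E)
print(haversine_km(51.240278, 22.713333, 42.431667, 14.180833))  # ≈ 1173.26 km
```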

How long does it take to fly from Lublin to Pescara?

The estimated flight time from Lublin Airport to Abruzzo Airport is 1 hour and 52 minutes.
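The calculator does not publish its formula. A common rule of thumb adds a fixed allowance for taxi, climb, and descent to the cruise time at an assumed average speed; both parameter values below are illustrative assumptions, not the site's actual figures, so the result only roughly approximates the 1 hour 52 minutes quoted above:

```python
def flight_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    # assumed average cruise speed and fixed taxi/climb/descent allowance
    return overhead_min + distance_miles / cruise_mph * 60

m = flight_minutes(730)
print(f"{int(m // 60)} h {round(m % 60)} min")  # 1 h 58 min with these assumptions
```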

What is the time difference between Lublin and Pescara?

There is no time difference between Lublin and Pescara.

Flight carbon footprint between Lublin Airport (LUZ) and Abruzzo Airport (PSR)

On average, flying from Lublin to Pescara generates about 128 kg of CO2 per passenger, which is equivalent to 282 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
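The kilogram-to-pound conversion uses the standard factor of roughly 2.20462 lbs per kg:

```python
KG_TO_LBS = 2.20462  # standard kilograms-to-pounds conversion factor

co2_kg = 128
print(round(co2_kg * KG_TO_LBS))  # 282 lbs
```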

Map of flight path and driving directions from Lublin to Pescara

See the map of the shortest flight path between Lublin Airport (LUZ) and Abruzzo Airport (PSR).

Airport information

Origin Lublin Airport
City: Lublin
Country: Poland
IATA Code: LUZ
ICAO Code: EPLB
Coordinates: 51°14′25″N, 22°42′48″E
Destination Abruzzo Airport
City: Pescara
Country: Italy
IATA Code: PSR
ICAO Code: LIBP
Coordinates: 42°25′54″N, 14°10′51″E
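The coordinates above are given in degrees, minutes, and seconds; the decimal-degree form that distance formulas take as input can be obtained with a small helper (an illustrative function, not part of the site):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere letter to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

print(round(dms_to_decimal(51, 14, 25, "N"), 4))  # 51.2403 (LUZ latitude)
print(round(dms_to_decimal(14, 10, 51, "E"), 4))  # 14.1808 (PSR longitude)
```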