
How far is Pescara from Cluj-Napoca?

The distance between Cluj-Napoca (Cluj International Airport) and Pescara (Abruzzo Airport) is 556 miles / 895 kilometers / 483 nautical miles.

The driving distance from Cluj-Napoca (CLJ) to Pescara (PSR) is 1043 miles / 1679 kilometers, and travel time by car is about 18 hours 50 minutes.

Cluj International Airport – Abruzzo Airport

  • 556 miles
  • 895 kilometers
  • 483 nautical miles


Distance from Cluj-Napoca to Pescara

There are several ways to calculate the distance from Cluj-Napoca to Pescara. Here are two standard methods:

Vincenty's formula (applied above)
  • 556.342 miles
  • 895.345 kilometers
  • 483.448 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
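Vincenty's inverse method is iterative: it repeatedly refines an auxiliary longitude until it converges, then evaluates a series expansion on the WGS-84 ellipsoid. As a sketch (the textbook inverse formula, using the airport coordinates from this page; the site's own implementation is not shown):

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2):
    """Inverse Vincenty distance on the WGS-84 ellipsoid, in statute miles."""
    a, f = 6378137.0, 1 / 298.257223563    # WGS-84 semi-major axis (m), flattening
    b = (1 - f) * a
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1, sinU2, cosU2 = math.sin(U1), math.cos(U1), math.sin(U2), math.cos(U2)
    lam = L
    for _ in range(200):                   # iterate until the auxiliary longitude converges
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.hypot(cosU2 * sinLam, cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = cosSigma - 2 * sinU1 * sinU2 / cos2Alpha
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (cos2SigmaM + C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < 1e-12:
            break
    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    deltaSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2)
        - B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2) * (-3 + 4 * cos2SigmaM ** 2)))
    meters = b * A * (sigma - deltaSigma)
    return meters / 1609.344               # meters -> statute miles

# CLJ: 46°47′6″N, 23°41′10″E ; PSR: 42°25′54″N, 14°10′51″E
print(round(vincenty_miles(46.785, 23.686111, 42.431667, 14.180833), 3))
```

Run against the coordinates listed below, this reproduces the ellipsoidal figure above to within rounding.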

Haversine formula
  • 555.356 miles
  • 893.759 kilometers
  • 482.591 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
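The haversine calculation fits in a few lines. A minimal sketch, assuming a mean Earth radius of 3,958.8 miles and the airport coordinates listed below:

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2, radius_miles=3958.8):
    """Great-circle distance on a sphere with the given mean Earth radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * radius_miles * math.asin(math.sqrt(a))

# CLJ: 46°47′6″N, 23°41′10″E ; PSR: 42°25′54″N, 14°10′51″E
clj = (46 + 47/60 + 6/3600, 23 + 41/60 + 10/3600)
psr = (42 + 25/60 + 54/3600, 14 + 10/60 + 51/3600)
print(round(haversine_miles(*clj, *psr), 3))  # close to the 555.356 miles above
```

The spherical result differs from the ellipsoidal (Vincenty) figure by about a mile on this route, which is typical at these distances.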

How long does it take to fly from Cluj-Napoca to Pescara?

The estimated flight time from Cluj International Airport to Abruzzo Airport is 1 hour and 33 minutes.
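The page does not state how this estimate is derived. A common rule of thumb, shown here as an assumption rather than the site's actual method, is cruise time at roughly 500 mph plus about 30 minutes of taxi, climb, and descent overhead, which lands within a few minutes of the figure above:

```python
# Assumed parameters (not from the source): average cruise speed and fixed overhead.
CRUISE_MPH = 500
OVERHEAD_MIN = 30

def estimate_flight_minutes(distance_miles):
    """Rule-of-thumb flight time: cruise time plus taxi/climb/descent overhead."""
    return distance_miles / CRUISE_MPH * 60 + OVERHEAD_MIN

minutes = estimate_flight_minutes(556)
print(f"{int(minutes // 60)} h {round(minutes % 60)} min")  # about 1 h 37 min
```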

Flight carbon footprint between Cluj International Airport (CLJ) and Abruzzo Airport (PSR)

On average, flying from Cluj-Napoca to Pescara generates about 107 kg of CO2 per passenger, which is about 236 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
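The unit conversion checks out: at 0.45359237 kg per pound, 107 kg rounds to 236 lbs.

```python
KG_PER_LB = 0.45359237   # exact definition of the avoirdupois pound

co2_kg = 107
co2_lbs = co2_kg / KG_PER_LB
print(round(co2_lbs))    # 236
```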

Map of flight path and driving directions from Cluj-Napoca to Pescara

See the map of the shortest flight path between Cluj International Airport (CLJ) and Abruzzo Airport (PSR).

Airport information

Origin Cluj International Airport
City: Cluj-Napoca
Country: Romania
IATA Code: CLJ
ICAO Code: LRCL
Coordinates: 46°47′6″N, 23°41′10″E
Destination Abruzzo Airport
City: Pescara
Country: Italy
IATA Code: PSR
ICAO Code: LIBP
Coordinates: 42°25′54″N, 14°10′51″E