How far is Pescara from Belfast?

The distance between Belfast (Belfast International Airport) and Pescara (Abruzzo Airport) is 1252 miles / 2015 kilometers / 1088 nautical miles.

The driving distance from Belfast (BFS) to Pescara (PSR) is 1555 miles / 2502 kilometers, and travel time by car is about 29 hours 38 minutes.

Belfast International Airport – Abruzzo Airport

Distance: 1252 miles / 2015 kilometers / 1088 nautical miles

Distance from Belfast to Pescara

There are several ways to calculate the distance from Belfast to Pescara. Here are two standard methods:

Vincenty's formula (applied above)
  • 1252.202 miles
  • 2015.224 kilometers
  • 1088.134 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
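
For illustration, here is a minimal pure-Python sketch of Vincenty's iterative inverse formula on the WGS-84 ellipsoid, using the airport coordinates listed further down converted to decimal degrees (the calculator's exact parameters are not published, so treat the output as approximate):

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Distance in metres between two points on the WGS-84 ellipsoid,
    computed with Vincenty's iterative inverse formula."""
    a = 6378137.0              # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a            # semi-minor axis

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0         # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
                                - B / 6 * cos_2sigma_m
                                * (-3 + 4 * sin_sigma ** 2)
                                * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# BFS (54°39′27″N, 6°12′56″W) and PSR (42°25′54″N, 14°10′51″E) in decimal degrees
metres = vincenty_distance(54.6575, -6.215556, 42.431667, 14.180833)
print(f"{metres / 1609.344:.1f} mi / {metres / 1000:.1f} km")  # ≈ 1252 mi / 2015 km
```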

Haversine formula
  • 1250.142 miles
  • 2011.908 kilometers
  • 1086.344 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
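
The same pair of airports with the haversine formula, assuming the commonly used mean Earth radius of 6,371 km, gives a figure close to the one quoted above:

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in miles, assuming a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    km = 2 * radius_km * math.asin(math.sqrt(a))
    return km / 1.609344

# Belfast International (BFS) to Abruzzo Airport (PSR), decimal degrees
print(f"{haversine_miles(54.6575, -6.215556, 42.431667, 14.180833):.1f} mi")  # ≈ 1250 mi
```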

How long does it take to fly from Belfast to Pescara?

The estimated flight time from Belfast International Airport to Abruzzo Airport is 2 hours and 52 minutes.
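
The calculator does not state how it derives this estimate. A common rule of thumb, assumed here purely for illustration, is a cruise speed of about 500 mph plus roughly 30 minutes for taxi, climb and descent, which lands in the same range:

```python
# Back-of-envelope flight-time estimate (assumed rule of thumb, not the site's stated method)
distance_mi = 1252.202           # Vincenty distance quoted above
hours = distance_mi / 500 + 0.5  # ~500 mph cruise plus ~30 min overhead
h, m = int(hours), round((hours - int(hours)) * 60)
print(f"~{h} h {m:02d} min")     # ~3 h 00 min, in the same range as the quoted 2 h 52 min
```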

Flight carbon footprint between Belfast International Airport (BFS) and Abruzzo Airport (PSR)

On average, flying from Belfast to Pescara generates about 164 kg of CO2 per passenger, which is roughly 361 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
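
As a quick sanity check of the unit conversion (1 kg ≈ 2.20462 lb; the underlying kilogram figure is itself rounded, so the last pound can differ slightly):

```python
kg = 164                    # quoted average CO2 per passenger for this route
lbs = kg * 2.20462          # ≈ 361.6 lb, in line with the ~361 lb figure above
per_mile = kg / 1252.202    # ≈ 0.13 kg CO2 per passenger-mile, derived from the quoted numbers
print(f"{lbs:.1f} lb, {per_mile:.3f} kg per mile")
```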

Map of flight path and driving directions from Belfast to Pescara

See the map of the shortest flight path between Belfast International Airport (BFS) and Abruzzo Airport (PSR).

Airport information

Origin: Belfast International Airport
City: Belfast
Country: United Kingdom
IATA Code: BFS
ICAO Code: EGAA
Coordinates: 54°39′27″N, 6°12′56″W
Destination: Abruzzo Airport
City: Pescara
Country: Italy
IATA Code: PSR
ICAO Code: LIBP
Coordinates: 42°25′54″N, 14°10′51″E