
How far is Patras from Olbia?

The distance between Olbia (Olbia Costa Smeralda Airport) and Patras (Patras Araxos Airport) is 663 miles / 1067 kilometers / 576 nautical miles.

The driving distance from Olbia (OLB) to Patras (GPA) is 938 miles / 1509 kilometers, and travel time by car is about 28 hours 5 minutes.

Olbia Costa Smeralda Airport – Patras Araxos Airport: 663 miles / 1067 kilometers / 576 nautical miles


Distance from Olbia to Patras

There are several ways to calculate the distance from Olbia to Patras. Here are two standard methods:

Vincenty's formula (applied above)
  • 663.223 miles
  • 1067.354 kilometers
  • 576.325 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
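As a rough illustration, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid, using the OLB and GPA coordinates from the airport table below (converted to decimal degrees). The function name and convergence tolerance are our own choices; the site's exact implementation may differ slightly.

    import math

    def vincenty_distance(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
        """Inverse Vincenty formula on the WGS-84 ellipsoid; returns meters.

        Note: may fail to converge for near-antipodal points.
        """
        a = 6378137.0             # WGS-84 semi-major axis (m)
        f = 1 / 298.257223563     # WGS-84 flattening
        b = (1 - f) * a           # semi-minor axis

        L = math.radians(lon2 - lon1)
        U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
        U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)

        lam = L
        for _ in range(max_iter):
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.hypot(cosU2 * sin_lam,
                                   cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
            if sin_sigma == 0:
                return 0.0        # coincident points
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos2_alpha = 1 - sin_alpha ** 2
            # cos(2 * sigma_m); zero on equatorial lines where cos2_alpha == 0
            cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                            if cos2_alpha else 0.0)
            C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
            lam_prev = lam
            lam = L + (1 - C) * f * sin_alpha * (
                sigma + C * sin_sigma * (
                    cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
            if abs(lam - lam_prev) < tol:
                break

        u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        delta_sigma = B * sin_sigma * (
            cos_2sigma_m + B / 4 * (
                cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
                - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
                * (-3 + 4 * cos_2sigma_m ** 2)))
        return b * A * (sigma - delta_sigma)

    # OLB and GPA coordinates from the airport table, in decimal degrees
    meters = vincenty_distance(40.8986, 9.5175, 38.1510, 21.4256)
    print(f"{meters / 1609.344:.3f} miles")  # ~663.2 miles
    print(f"{meters / 1000:.3f} km")         # ~1067.4 km
    print(f"{meters / 1852:.3f} NM")         # ~576.3 NM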

Haversine formula
  • 661.800 miles
  • 1065.064 kilometers
  • 575.089 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
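A haversine sketch in the same style, assuming a mean Earth radius of 6,371 km (the exact radius chosen slightly affects the result):

    import math

    def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
        """Great-circle distance on a sphere of the given mean radius, in km."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        h = (math.sin(dphi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
        return 2 * radius_km * math.asin(math.sqrt(h))

    print(f"{haversine_distance(40.8986, 9.5175, 38.1510, 21.4256):.3f} km")  # ~1065 km

Because the sphere slightly underestimates the ellipsoid along this route, the haversine result comes out about 2 km shorter than Vincenty's.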

How long does it take to fly from Olbia to Patras?

The estimated flight time from Olbia Costa Smeralda Airport to Patras Araxos Airport is 1 hour and 45 minutes.
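Estimates like this are typically built from a cruise leg plus a fixed allowance for taxi, climb, and descent. The exact parameters used here aren't published; the sketch below assumes a ~500 mph cruise speed and a 30-minute overhead, which lands in the same ballpark as the figure quoted above.

    def estimate_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
        """Rule-of-thumb block time: cruise leg plus fixed taxi/climb/descent.

        Both parameters are assumptions, not the calculator's published values.
        """
        total_min = round(distance_miles / cruise_mph * 60 + overhead_min)
        return divmod(total_min, 60)

    hours, minutes = estimate_flight_time(663.223)
    print(f"about {hours} h {minutes} min")  # about 1 h 50 min with these assumed parameters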

Flight carbon footprint between Olbia Costa Smeralda Airport (OLB) and Patras Araxos Airport (GPA)

On average, flying from Olbia to Patras generates about 120 kg of CO2 per passenger, which is equivalent to 265 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
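The quoted figure works out to roughly 0.18 kg of CO2 per passenger-mile. Here is a back-of-the-envelope sketch using that factor, back-calculated from the numbers above (it is an assumption for illustration, not a published emission factor):

    KG_PER_LB = 0.45359237  # exact definition of the pound

    def co2_estimate_kg(distance_miles, kg_per_passenger_mile=0.181):
        """Flat per-passenger-mile CO2 factor; the default is an assumption
        back-calculated from the figures above (120 kg / 663 miles)."""
        return distance_miles * kg_per_passenger_mile

    kg = co2_estimate_kg(663.223)
    print(f"{kg:.0f} kg = {kg / KG_PER_LB:.0f} lbs")  # ~120 kg = ~265 lbs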

Map of flight path and driving directions from Olbia to Patras

See the map of the shortest flight path between Olbia Costa Smeralda Airport (OLB) and Patras Araxos Airport (GPA).

Airport information

Origin Olbia Costa Smeralda Airport
City: Olbia
Country: Italy
IATA Code: OLB
ICAO Code: LIEO
Coordinates: 40°53′55″N, 9°31′3″E
Destination Patras Araxos Airport
City: Patras
Country: Greece
IATA Code: GPA
ICAO Code: LGRX
Coordinates: 38°9′3″N, 21°25′32″E