How far is Lopez, WA, from Liberia?

The distance between Liberia (Daniel Oduber Quirós International Airport) and Lopez (Lopez Island Airport) is 3386 miles / 5450 kilometers / 2943 nautical miles.

The driving distance from Liberia (LIR) to Lopez (LPS) is 4332 miles / 6971 kilometers, and travel time by car is about 85 hours 58 minutes.

Distance from Liberia to Lopez

There are several ways to calculate the distance from Liberia to Lopez. Here are two standard methods:

Vincenty's formula (applied above)
  • 3386.300 miles
  • 5449.721 kilometers
  • 2942.614 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
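For readers who want to reproduce the figure, here is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid. It is not the calculator's own code; the coordinates are the LIR and LPS values from the Airport information section converted to decimal degrees, so the result should land within a few metres of the ellipsoidal distance quoted above.

```python
import math

def vincenty_distance_m(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Ellipsoidal (WGS-84) distance in metres via Vincenty's inverse formula."""
    a = 6378137.0            # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563    # WGS-84 flattening
    b = (1 - f) * a          # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0                       # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)   # equatorial-line special case
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m + C * cos_sigma *
                                     (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
        B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) *
        (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)     # geodesic length in metres

# LIR and LPS coordinates in decimal degrees (from the Airport information below)
lir = (10.593056, -85.544167)
lps = (48.483889, -122.937778)

metres = vincenty_distance_m(*lir, *lps)
print(f"{metres / 1609.344:.1f} mi, {metres / 1000:.1f} km, {metres / 1852:.1f} NM")
```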

Haversine formula
  • 3389.645 miles
  • 5455.105 kilometers
  • 2945.521 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
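The haversine figure is easy to check in a few lines. The sketch below is a generic implementation assuming a mean Earth radius of 6,371 km (not the calculator's own code), using the same decimal coordinates as above, so the last digits may differ slightly.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, r_km=6371.0):
    """Great-circle distance in kilometres on a sphere of radius r_km."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r_km * math.asin(math.sqrt(h))

km = haversine_km(10.593056, -85.544167, 48.483889, -122.937778)
print(f"{km / 1.609344:.1f} mi, {km:.1f} km, {km / 1.852:.1f} NM")
# roughly 3390 mi / 5455 km / 2946 NM, in line with the haversine values above
```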

How long does it take to fly from Liberia to Lopez?

The estimated flight time from Daniel Oduber Quirós International Airport to Lopez Island Airport is 6 hours and 54 minutes.
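The page does not state how the flight time is modelled, but as a quick sanity check the quoted distance and time imply an average block speed of roughly 490 mph:

```python
distance_mi = 3386           # great-circle distance quoted above
flight_time_h = 6 + 54 / 60  # 6 hours 54 minutes

avg_speed_mph = distance_mi / flight_time_h
print(f"Implied average speed: {avg_speed_mph:.0f} mph")  # about 491 mph
```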

Flight carbon footprint between Daniel Oduber Quirós International Airport (LIR) and Lopez Island Airport (LPS)

On average, flying from Liberia to Lopez generates about 381 kg of CO2 per passenger, which is equivalent to roughly 839 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
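As a rough cross-check of the published figures (not the site's own methodology), the estimate works out to about 0.11 kg of CO2 per passenger-mile, and the pound value is a plain unit conversion at 1 kg ≈ 2.20462 lb:

```python
co2_kg = 381                 # per-passenger estimate quoted above
distance_mi = 3386

kg_per_mile = co2_kg / distance_mi
co2_lb = co2_kg * 2.20462    # kilograms to pounds

print(f"{kg_per_mile:.2f} kg CO2 per passenger-mile")  # about 0.11
print(f"{co2_lb:.0f} lb")    # about 840 lb; the page's 839 lb presumably converts the unrounded kg estimate
```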

Map of flight path and driving directions from Liberia to Lopez

See the map of the shortest flight path between Daniel Oduber Quirós International Airport (LIR) and Lopez Island Airport (LPS).

Airport information

Origin: Daniel Oduber Quirós International Airport
City: Liberia
Country: Costa Rica
IATA Code: LIR
ICAO Code: MRLB
Coordinates: 10°35′35″N, 85°32′39″W
Destination: Lopez Island Airport
City: Lopez, WA
Country: United States
IATA Code: LPS
ICAO Code: S31
Coordinates: 48°29′2″N, 122°56′16″W
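
The coordinates above are listed in degrees, minutes and seconds, while the distance formulas sketched earlier expect signed decimal degrees. A small assumed helper for that conversion:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere (N/S/E/W) to signed decimal degrees."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

# Daniel Oduber Quirós International Airport (LIR): 10°35′35″N, 85°32′39″W
lir = (dms_to_decimal(10, 35, 35, "N"), dms_to_decimal(85, 32, 39, "W"))
# Lopez Island Airport (LPS): 48°29′2″N, 122°56′16″W
lps = (dms_to_decimal(48, 29, 2, "N"), dms_to_decimal(122, 56, 16, "W"))

print(lir)  # (10.5930..., -85.5441...)
print(lps)  # (48.4838..., -122.9377...)
```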