How far is Lopez, WA, from Hyannis, MA?

The distance between Hyannis (Cape Cod Gateway Airport) and Lopez (Lopez Island Airport) is 2567 miles / 4131 kilometers / 2230 nautical miles.

The driving distance from Hyannis (HYA) to Lopez (LPS) is 3089 miles / 4972 kilometers, and travel time by car is about 58 hours 15 minutes.

Cape Cod Gateway Airport – Lopez Island Airport
  • 2567 miles
  • 4131 kilometers
  • 2230 nautical miles

Distance from Hyannis to Lopez

There are several ways to calculate the distance from Hyannis to Lopez. Here are two standard methods:

Vincenty's formula (applied above)
  • 2566.586 miles
  • 4130.520 kilometers
  • 2230.302 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
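As a rough illustration, here is a sketch of Vincenty's inverse method in Python on the WGS-84 ellipsoid (a = 6378137 m, f = 1/298.257223563). Variable names follow the usual published notation; the helper name vincenty_km, the iteration tolerance, and the cutoff are our own choices, and the coordinates are the HYA and LPS values from the airport information section below, converted to decimal degrees.

```python
from math import radians, sin, cos, tan, atan, atan2, sqrt

def vincenty_km(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Geodesic distance (km) between two lat/lon points on the WGS-84 ellipsoid."""
    a = 6378137.0                      # semi-major axis, metres
    f = 1 / 298.257223563              # flattening
    b = (1 - f) * a                    # semi-minor axis

    U1 = atan((1 - f) * tan(radians(lat1)))   # reduced latitudes
    U2 = atan((1 - f) * tan(radians(lat2)))
    L = radians(lon2 - lon1)
    lam = L
    sinU1, cosU1 = sin(U1), cos(U1)
    sinU2, cosU2 = sin(U2), cos(U2)

    for _ in range(max_iter):
        sin_lam, cos_lam = sin(lam), cos(lam)
        sin_sigma = sqrt((cosU2 * sin_lam) ** 2 +
                         (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        if sin_sigma == 0:
            return 0.0                 # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        if cos2_alpha != 0:
            cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        else:
            cos_2sigma_m = 0.0         # both points on the equator
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
            B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0   # kilometres

# HYA (41°40′9″N, 70°16′49″W) and LPS (48°29′2″N, 122°56′16″W) in decimal degrees
hya = (41.669167, -70.280278)
lps = (48.483889, -122.937778)
print(f"{vincenty_km(*hya, *lps):.1f} km")
```

With these inputs the result should land close to the ~4130.5 km figure quoted above; any small difference comes down to rounding and the convergence tolerance.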

Haversine formula
  • 2559.718 miles
  • 4119.467 kilometers
  • 2224.334 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
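For comparison, here is a minimal haversine sketch, assuming a mean Earth radius of 6371 km (a slightly different radius shifts the result slightly). The coordinates are again the HYA and LPS values listed below.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points on a spherical Earth."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
    return 2 * radius_km * asin(sqrt(a))

# HYA: 41°40′9″N, 70°16′49″W   LPS: 48°29′2″N, 122°56′16″W
hya = (41.669167, -70.280278)
lps = (48.483889, -122.937778)

km = haversine_km(hya[0], hya[1], lps[0], lps[1])
print(f"{km:.1f} km  /  {km * 0.621371:.1f} mi  /  {km / 1.852:.1f} NM")
```

With the 6371 km radius this lands close to the ~4119 km haversine figure quoted above, slightly shorter than the ellipsoidal Vincenty result.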

How long does it take to fly from Hyannis to Lopez?

The estimated flight time from Cape Cod Gateway Airport to Lopez Island Airport is 5 hours and 21 minutes.
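The time estimate follows from the distance and an average block speed: 2567 miles over 5 hours 21 minutes works out to roughly 480 mph. A tiny sketch of that arithmetic is below; the 480 mph figure is an assumption inferred from the numbers above, not a documented parameter of the calculator.

```python
# Rough flight-time arithmetic: distance divided by an assumed average block speed.
distance_mi = 2567
avg_speed_mph = 480                     # assumption inferred from the figures above
hours = distance_mi / avg_speed_mph
h, m = int(hours), round((hours - int(hours)) * 60)
print(f"~{h} h {m} min")                # ~5 h 21 min
```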

Flight carbon footprint between Cape Cod Gateway Airport (HYA) and Lopez Island Airport (LPS)

On average, flying from Hyannis to Lopez generates about 283 kg of CO2 per passenger, which is equivalent to 624 pounds (lbs). This figure is an estimate and includes only the CO2 generated by burning jet fuel.
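The kilogram-to-pound conversion is straightforward (1 kg ≈ 2.20462 lb); a quick check of the figure above:

```python
# Convert the per-passenger CO2 estimate from kilograms to pounds.
co2_kg = 283
co2_lb = co2_kg * 2.20462    # 1 kg ≈ 2.20462 lb
print(f"{co2_lb:.0f} lbs")   # ≈ 624 lbs
```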

Map of flight path and driving directions from Hyannis to Lopez

See the map of the shortest flight path between Cape Cod Gateway Airport (HYA) and Lopez Island Airport (LPS).

Airport information

Origin Cape Cod Gateway Airport
City: Hyannis, MA
Country: United States
IATA Code: HYA
ICAO Code: KHYA
Coordinates: 41°40′9″N, 70°16′49″W
Destination Lopez Island Airport
City: Lopez, WA
Country: United States
IATA Code: LPS
ICAO Code: S31
Coordinates: 48°29′2″N, 122°56′16″W