
How far is Hyannis, MA, from Lopez, WA?

The distance between Lopez (Lopez Island Airport) and Hyannis (Cape Cod Gateway Airport) is 2567 miles / 4131 kilometers / 2230 nautical miles.

The driving distance from Lopez (LPS) to Hyannis (HYA) is 3088 miles / 4970 kilometers, and travel time by car is about 58 hours 14 minutes.

Lopez Island Airport – Cape Cod Gateway Airport

2567 miles · 4131 kilometers · 2230 nautical miles


Distance from Lopez to Hyannis

There are several ways to calculate the distance from Lopez to Hyannis. Here are two standard methods:

Vincenty's formula (applied above)
  • 2566.586 miles
  • 4130.520 kilometers
  • 2230.302 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
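The iteration behind that figure can be sketched as follows. This is a minimal implementation of the inverse Vincenty formula on the WGS-84 ellipsoid; the decimal coordinates in the usage line are converted from the DMS values listed in the airport information below, and the exact result can differ slightly depending on rounding of those coordinates.

```python
import math

def vincenty(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Inverse Vincenty distance in metres on the WGS-84 ellipsoid."""
    a = 6378137.0                 # semi-major axis (m)
    f = 1 / 298.257223563         # flattening
    b = (1 - f) * a               # semi-minor axis (m)
    # Reduced latitudes on the auxiliary sphere
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)
    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0            # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # cos(2*sigma_m); zero on equatorial lines where cos2_alpha == 0
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                   if cos2_alpha else 0.0)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break
    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
                           * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma)

# LPS (48°29′2″N, 122°56′16″W) to HYA (41°40′9″N, 70°16′49″W)
km = vincenty(48.483889, -122.937778, 41.669167, -70.280278) / 1000
print(f"{km:.1f} km")  # ≈ 4130.5 km, matching the figure above
```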

Haversine formula
  • 2559.718 miles
  • 4119.467 kilometers
  • 2224.334 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
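A minimal sketch of the haversine calculation, using a mean Earth radius of 6371 km (the small discrepancy from the Vincenty result above comes from this spherical simplification):

```python
import math

def haversine(lat1, lon1, lat2, lon2, r=6371.0):
    """Great-circle distance in kilometres, assuming a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# Same coordinates as above, converted to decimal degrees
km = haversine(48.483889, -122.937778, 41.669167, -70.280278)
print(f"{km:.0f} km")  # ≈ 4119 km, matching the haversine figure above
```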

How long does it take to fly from Lopez to Hyannis?

The estimated flight time from Lopez Island Airport to Cape Cod Gateway Airport is 5 hours and 21 minutes.
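The stated estimate is consistent with an average block speed of roughly 480 mph over the great-circle distance; that speed is an assumption reverse-engineered from the figures above, not a value given by the source:

```python
distance_miles = 2566.586   # Vincenty distance from above
cruise_mph = 480            # assumed average block speed
hours = distance_miles / cruise_mph
h = int(hours)
m = round((hours - h) * 60)
print(f"{h} h {m} min")     # 5 h 21 min
```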

Flight carbon footprint between Lopez Island Airport (LPS) and Cape Cod Gateway Airport (HYA)

On average, flying from Lopez to Hyannis generates about 283 kg of CO2 per passenger; 283 kilograms is equal to 624 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
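The unit conversion checks out with the standard factor of about 2.20462 lb per kg:

```python
KG_TO_LB = 2.20462          # pounds per kilogram
co2_kg = 283                # estimated CO2 per passenger
co2_lb = round(co2_kg * KG_TO_LB)
print(co2_lb)               # 624
```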

Map of flight path and driving directions from Lopez to Hyannis

See the map of the shortest flight path between Lopez Island Airport (LPS) and Cape Cod Gateway Airport (HYA).

Airport information

Origin: Lopez Island Airport
City: Lopez, WA
Country: United States
IATA Code: LPS
ICAO Code: S31
Coordinates: 48°29′2″N, 122°56′16″W
Destination: Cape Cod Gateway Airport
City: Hyannis, MA
Country: United States
IATA Code: HYA
ICAO Code: KHYA
Coordinates: 41°40′9″N, 70°16′49″W
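The coordinates above are given in degrees, minutes, and seconds, while the distance formulas expect decimal degrees. A small helper for the conversion (southern and western hemispheres take a negative sign):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere letter to decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# Lopez Island Airport: 48°29′2″N, 122°56′16″W
print(dms_to_decimal(48, 29, 2, "N"))    # ≈ 48.4839
print(dms_to_decimal(122, 56, 16, "W"))  # ≈ -122.9378
```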