
How far is Lopez, WA, from Westerly, RI?

The distance between Westerly (Westerly State Airport) and Lopez (Lopez Island Airport) is 2508 miles / 4037 kilometers / 2180 nautical miles.

The driving distance from Westerly (WST) to Lopez (LPS) is 3068 miles / 4937 kilometers, and travel time by car is about 56 hours 31 minutes.

Westerly State Airport – Lopez Island Airport

2508 miles / 4037 kilometers / 2180 nautical miles


Distance from Westerly to Lopez

There are several ways to calculate the distance from Westerly to Lopez. Here are two standard methods:

Vincenty's formula (applied above)
  • 2508.197 miles
  • 4036.552 kilometers
  • 2179.564 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
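
For readers who want to reproduce the figure, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The function name and structure are ours for illustration, not the calculator's actual code; the coordinates come from the airport information at the bottom of the page.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Vincenty's inverse formula on the WGS-84 ellipsoid; returns metres."""
    a = 6378137.0              # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a            # semi-minor axis

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0         # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # cos(2 * sigma_m); zero when both points lie on the equator
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_new = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (2 * cos_2sm ** 2 - 1)))
        if abs(lam_new - lam) < tol:
            lam = lam_new
            break
        lam = lam_new

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (2 * cos_2sm ** 2 - 1) -
        B / 6 * cos_2sm * (4 * sin_sigma ** 2 - 3) * (4 * cos_2sm ** 2 - 3)))
    return b * A * (sigma - delta_sigma)

# WST and LPS coordinates in decimal degrees (from the airport info below)
meters = vincenty_distance(41.349444, -71.803333, 48.483889, -122.937778)
print(meters / 1609.344)   # ≈ 2508.2 miles
print(meters / 1000)       # ≈ 4036.55 kilometers
print(meters / 1852)       # ≈ 2179.6 nautical miles
```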

Haversine formula
  • 2501.553 miles
  • 4025.859 kilometers
  • 2173.790 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
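
A comparable sketch of the haversine formula follows, using the commonly quoted mean Earth radius of 6371.0088 km (the exact radius constant the calculator uses isn't stated, so small differences in the last digits are expected):

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, r_km=6371.0088):
    """Great-circle distance on a sphere of mean Earth radius; returns km."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r_km * math.asin(math.sqrt(a))

km = haversine_distance(41.349444, -71.803333, 48.483889, -122.937778)
print(km)             # ≈ 4025.9 km
print(km / 1.609344)  # ≈ 2501.6 miles
```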

How long does it take to fly from Westerly to Lopez?

The estimated flight time from Westerly State Airport to Lopez Island Airport is 5 hours and 14 minutes.
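
The calculator doesn't publish the formula behind this estimate. As one illustration only, dividing the Vincenty distance by an assumed average block speed of about 480 mph (our assumption, not the site's documented method) reproduces the quoted figure:

```python
distance_mi = 2508.197   # Vincenty distance from above

# Hypothetical average block speed; an assumption chosen only for illustration
avg_speed_mph = 480

hours = distance_mi / avg_speed_mph
h, m = int(hours), round(hours % 1 * 60)
print(f"{h} h {m} min")  # → 5 h 14 min
```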

Flight carbon footprint between Westerly State Airport (WST) and Lopez Island Airport (LPS)

On average, flying from Westerly to Lopez generates about 276 kg of CO2 per passenger, which is roughly 609 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
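
As a quick check on the unit conversion (2.20462262 is the standard kilograms-to-pounds factor):

```python
co2_kg = 276                        # per-passenger estimate quoted above
co2_lb = co2_kg * 2.20462262        # standard kg-to-lb conversion factor
print(round(co2_lb))                # 608; the page's 609 comes from converting
                                    # the unrounded kilogram figure first
print(round(co2_kg / 2508.197, 3))  # ≈ 0.11 kg of CO2 per mile flown
```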

Map of flight path and driving directions from Westerly to Lopez

See the map of the shortest flight path between Westerly State Airport (WST) and Lopez Island Airport (LPS).

Airport information

Origin: Westerly State Airport
City: Westerly, RI
Country: United States
IATA Code: WST
ICAO Code: KWST
Coordinates: 41°20′58″N, 71°48′12″W
Destination: Lopez Island Airport
City: Lopez, WA
Country: United States
IATA Code: LPS
ICAO Code: S31
Coordinates: 48°29′2″N, 122°56′16″W
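
The distance formulas above take decimal degrees, while the coordinates here are listed in degrees, minutes, and seconds. A small helper (ours, for illustration) converts between the two:

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

# Coordinates from the airport information above
wst = (dms_to_decimal(41, 20, 58, "N"), dms_to_decimal(71, 48, 12, "W"))
lps = (dms_to_decimal(48, 29, 2, "N"), dms_to_decimal(122, 56, 16, "W"))
print(wst)  # (41.3494..., -71.8033...)
print(lps)  # (48.4838..., -122.9377...)
```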