
How far is Seattle, WA, from Lopez, WA?

The distance between Lopez (Lopez Island Airport) and Seattle (Seattle–Tacoma International Airport) is 77 miles / 124 kilometers / 67 nautical miles.

The driving distance from Lopez (LPS) to Seattle (SEA) is 115 miles / 185 kilometers, and travel time by car is about 3 hours 8 minutes.

Lopez Island Airport – Seattle–Tacoma International Airport


Distance from Lopez to Seattle

There are several ways to calculate the distance from Lopez to Seattle. Here are two standard methods:

Vincenty's formula (applied above)
  • 77.228 miles
  • 124.286 kilometers
  • 67.109 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
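A sketch of Vincenty's inverse formula in Python, using the WGS-84 ellipsoid constants and the airport coordinates listed below. This is an illustrative implementation, not the calculator's own code; the iteration count and convergence tolerance are assumptions.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    # WGS-84 ellipsoid parameters
    a = 6378137.0          # semi-major axis (m)
    f = 1 / 298.257223563  # flattening
    b = (1 - f) * a        # semi-minor axis (m)

    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)
    # Reduced latitudes on the auxiliary sphere
    U1 = math.atan((1 - f) * math.tan(phi1))
    U2 = math.atan((1 - f) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sinLam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sinLam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
        - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
                * (-3 + 4 * cos_2sigma_m ** 2)))
    meters = b * A * (sigma - delta_sigma)
    return meters / 1609.344  # international mile

# LPS (48°29′2″N, 122°56′16″W) to SEA (47°26′56″N, 122°18′32″W)
miles = vincenty_miles(48.48389, -122.93778, 47.44889, -122.30889)
```

For nearly antipodal points the iteration can fail to converge, which is why production geodesic libraries add safeguards; for this short route it converges in a handful of iterations.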

Haversine formula
  • 77.198 miles
  • 124.238 kilometers
  • 67.083 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth, giving the great-circle distance, the shortest path between two points along the surface.

How long does it take to fly from Lopez to Seattle?

The estimated flight time from Lopez Island Airport to Seattle–Tacoma International Airport is 38 minutes.
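The calculator's exact model isn't published, but flight-time estimates of this kind are commonly built from a fixed overhead for taxi, climb, and descent plus cruise time over the distance. A hypothetical version, where the 30-minute overhead and 500 mph cruise speed are illustrative assumptions rather than the site's actual parameters:

```python
def flight_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    # Assumed model: fixed ground/climb/descent overhead plus cruise time.
    return overhead_min + distance_miles / cruise_mph * 60

est = flight_minutes(77.228)  # about 39 minutes for the 77-mile route
```

With these assumed parameters the estimate comes out close to the quoted 38 minutes.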

What is the time difference between Lopez and Seattle?

There is no time difference between Lopez and Seattle.

Flight carbon footprint between Lopez Island Airport (LPS) and Seattle–Tacoma International Airport (SEA)

On average, flying from Lopez to Seattle generates about 36 kg of CO2 per passenger; 36 kilograms is equal to about 79 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
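As a check on the unit conversion, using the exact definition of the avoirdupois pound:

```python
KG_PER_LB = 0.45359237  # exact kilogram-pound definition

co2_kg = 36
co2_lb = co2_kg / KG_PER_LB  # roughly 79.4 lb
```

So 36 kg rounds to 79 lb, not 80.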

Map of flight path and driving directions from Lopez to Seattle

See the map of the shortest flight path between Lopez Island Airport (LPS) and Seattle–Tacoma International Airport (SEA).

Airport information

Origin Lopez Island Airport
City: Lopez, WA
Country: United States
IATA Code: LPS
ICAO Code: S31
Coordinates: 48°29′2″N, 122°56′16″W
Destination Seattle–Tacoma International Airport
City: Seattle, WA
Country: United States
IATA Code: SEA
ICAO Code: KSEA
Coordinates: 47°26′56″N, 122°18′32″W