
How far is Lopez, WA, from Salisbury, MD?

The distance between Salisbury (Salisbury–Ocean City–Wicomico Regional Airport) and Lopez (Lopez Island Airport) is 2444 miles / 3934 kilometers / 2124 nautical miles.

The driving distance from Salisbury (SBY) to Lopez (LPS) is 2946 miles / 4741 kilometers, and travel time by car is about 54 hours 23 minutes.

Salisbury–Ocean City–Wicomico Regional Airport – Lopez Island Airport
  • 2444 miles
  • 3934 kilometers
  • 2124 nautical miles


Distance from Salisbury to Lopez

There are several ways to calculate the distance from Salisbury to Lopez. Here are two standard methods:

Vincenty's formula (applied above)
  • 2444.443 miles
  • 3933.949 kilometers
  • 2124.163 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
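Vincenty's inverse formula is iterative rather than closed-form. Below is a minimal Python sketch of it on the WGS-84 ellipsoid, using the airport coordinates listed at the bottom of this page (converted to decimal degrees); it should reproduce the figures above to within rounding.

```python
import math

def vincenty_inverse(lat1, lon1, lat2, lon2):
    """Geodesic distance in metres on the WGS-84 ellipsoid
    (Vincenty's inverse formula, iterative)."""
    a = 6378137.0                  # semi-major axis (m)
    f = 1 / 298.257223563          # flattening
    b = (1 - f) * a                # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):           # iterate until the longitude converges
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.hypot(cosU2 * sinLam,
                              cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        if sinSigma == 0:
            return 0.0             # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = (cosSigma - 2 * sinU1 * sinU2 / cos2Alpha
                      if cos2Alpha else 0.0)  # both points on the equator
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (cos2SigmaM + C * cosSigma *
                                    (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < 1e-12:
            break

    u2 = cos2Alpha * (a * a - b * b) / (b * b)
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    dSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2) -
        B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2) *
        (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - dSigma)

# Airport coordinates from this page, in decimal degrees
SBY = (38.34028, -75.51028)    # 38°20′25″N, 75°30′37″W
LPS = (48.48389, -122.93778)   # 48°29′2″N, 122°56′16″W

m = vincenty_inverse(*SBY, *LPS)
print(f"{m / 1609.344:.1f} mi / {m / 1000:.1f} km / {m / 1852:.1f} nmi")
```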

Haversine formula
  • 2438.582 miles
  • 3924.518 kilometers
  • 2119.070 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
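By comparison, the haversine formula fits in a few lines. A sketch with the same coordinates follows; the exact Earth radius behind the figures above isn't stated, so this uses the common mean radius of 6371 km, which lands close to the ~3924.5 km quoted.

```python
import math

def haversine(lat1, lon1, lat2, lon2, r_km=6371.0):
    """Great-circle distance in kilometres, assuming a spherical Earth."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = p2 - p1
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2)
    return 2 * r_km * math.asin(math.sqrt(h))

print(haversine(38.34028, -75.51028, 48.48389, -122.93778))  # ≈ 3924 km
```

The small gap between the two results (about 6 miles here) is the cost of treating the Earth as a sphere rather than an ellipsoid.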

How long does it take to fly from Salisbury to Lopez?

The estimated flight time from Salisbury–Ocean City–Wicomico Regional Airport to Lopez Island Airport is 5 hours and 7 minutes.
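The model behind this estimate isn't stated. A common rule of thumb adds a fixed allowance for taxi, climb, and descent to cruise time at a typical jet speed; the sketch below uses assumed values (30 minutes of overhead, roughly 500 mph cruise), so it only approximates the 5 h 7 min figure rather than reproducing it.

```python
def estimated_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    """Rule-of-thumb flight time: cruise time plus a fixed taxi/climb/descent
    allowance. cruise_mph and overhead_min are assumptions, not this page's
    actual model."""
    total_min = overhead_min + distance_miles / cruise_mph * 60
    h, m = divmod(round(total_min), 60)
    return f"{h} h {m} min"

print(estimated_flight_time(2444))  # ≈ 5 h 23 min with these assumptions
```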

Flight carbon footprint between Salisbury–Ocean City–Wicomico Regional Airport (SBY) and Lopez Island Airport (LPS)

On average, flying from Salisbury to Lopez generates about 269 kg of CO2 per passenger, which is roughly 593 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
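Per-passenger CO2 figures like this one are typically derived from fuel burn: burning 1 kg of jet fuel releases about 3.16 kg of CO2 (a standard ICAO emission factor, and the only number below not taken from this page). A small sketch converts the footprint to pounds and back-computes the implied fuel burn per seat.

```python
KG_PER_LB = 0.45359237   # exact definition of the pound
CO2_PER_KG_FUEL = 3.16   # kg CO2 per kg jet fuel (ICAO factor; assumed here)

co2_kg = 269             # per-passenger estimate from this page
print(f"{co2_kg / KG_PER_LB:.0f} lbs")                  # -> 593 lbs
print(f"~{co2_kg / CO2_PER_KG_FUEL:.0f} kg fuel/seat")  # -> ~85 kg fuel
```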

Map of flight path and driving directions from Salisbury to Lopez

See the map of the shortest flight path between Salisbury–Ocean City–Wicomico Regional Airport (SBY) and Lopez Island Airport (LPS).

Airport information

Origin: Salisbury–Ocean City–Wicomico Regional Airport
City: Salisbury, MD
Country: United States
IATA Code: SBY
ICAO Code: KSBY
Coordinates: 38°20′25″N, 75°30′37″W
Destination: Lopez Island Airport
City: Lopez, WA
Country: United States
IATA Code: LPS
ICAO Code: S31
Coordinates: 48°29′2″N, 122°56′16″W
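The coordinates above are in degrees, minutes, and seconds, while the distance formulas earlier expect decimal degrees. A small conversion sketch (the helper name is arbitrary):

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to
    signed decimal degrees (S and W are negative)."""
    sign = -1 if hemisphere in "SW" else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

sby = (dms_to_decimal(38, 20, 25, "N"), dms_to_decimal(75, 30, 37, "W"))
lps = (dms_to_decimal(48, 29, 2, "N"), dms_to_decimal(122, 56, 16, "W"))
print(sby)  # (38.3402..., -75.5102...)
print(lps)  # (48.4838..., -122.9377...)
```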