
How far is Lopez, WA, from Muscle Shoals, AL?

The distance between Muscle Shoals (Northwest Alabama Regional Airport) and Lopez (Lopez Island Airport) is 2036 miles / 3276 kilometers / 1769 nautical miles.

The driving distance from Muscle Shoals (MSL) to Lopez (LPS) is 2574 miles / 4143 kilometers, and travel time by car is about 46 hours 23 minutes.


Distance from Muscle Shoals to Lopez

There are several ways to calculate the distance from Muscle Shoals to Lopez. Here are two standard methods:

Vincenty's formula (applied above)
  • 2035.888 miles
  • 3276.443 kilometers
  • 1769.138 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
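For readers who want to reproduce the figure, here is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid. The decimal coordinates are converted from the airport coordinates listed at the bottom of the page; the iteration cap and convergence tolerance are implementation choices, not part of the published formula.

```python
import math

def vincenty_inverse(lat1, lon1, lat2, lon2):
    """Distance in meters between two points on the WGS-84 ellipsoid
    using Vincenty's iterative inverse method."""
    a = 6378137.0                 # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563         # WGS-84 flattening
    b = (1 - f) * a               # semi-minor axis

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):          # iterate until lambda converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0            # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2   # zero only for equatorial lines,
                                          # a special case omitted here
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
                          * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma)  # geodesic distance in meters

# MSL (34°44′43″N, 87°36′36″W) to LPS (48°29′2″N, 122°56′16″W)
meters = vincenty_inverse(34.745278, -87.61, 48.483889, -122.937778)
print(meters / 1000)      # ~3276 km
print(meters / 1609.344)  # ~2036 statute miles
print(meters / 1852)      # ~1769 nautical miles
```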

Haversine formula
  • 2032.288 miles
  • 3270.650 kilometers
  • 1766.010 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
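The haversine version is short enough to sketch in a few lines. Note that the mean Earth radius used here (6,371 km) is a convention, so the output may differ from the figures above in the last decimal places.

```python
import math

def haversine(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in kilometers on a sphere of the given radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h))

# MSL to LPS, same decimal coordinates as the Vincenty example
km = haversine(34.745278, -87.61, 48.483889, -122.937778)
print(km)             # ~3270 km
print(km / 1.609344)  # ~2032 miles
```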

How long does it take to fly from Muscle Shoals to Lopez?

The estimated flight time from Northwest Alabama Regional Airport to Lopez Island Airport is 4 hours and 21 minutes.
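The calculator does not publish its timing model, but estimates like this are usually cruise time at a typical airliner speed plus a fixed allowance for taxi, takeoff, climb, and descent. The sketch below is hypothetical: the 500 mph cruise speed and 30-minute allowance are assumptions, so it will not exactly reproduce the 4 hours 21 minutes quoted above.

```python
def estimated_flight_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough block time: cruise at an assumed average speed plus a fixed
    allowance for taxi, takeoff, climb, and descent (both assumed values)."""
    return distance_miles / cruise_mph * 60 + overhead_min

print(estimated_flight_minutes(2036))  # ~274 min, i.e. roughly 4.5 hours
```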

Flight carbon footprint between Northwest Alabama Regional Airport (MSL) and Lopez Island Airport (LPS)

On average, flying from Muscle Shoals to Lopez generates about 222 kg (488 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
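The emission model is likewise unpublished, but per-passenger figures of this kind are typically distance multiplied by an emission factor. Dividing the numbers above gives roughly 0.109 kg of CO2 per passenger-mile; the sketch below simply bakes in that back-derived factor as an assumption.

```python
KG_PER_LB = 0.45359237

def co2_kg(distance_miles, kg_per_passenger_mile=0.109):
    """Per-passenger CO2 from burning jet fuel, using an emission factor
    back-derived from the figures quoted above (an assumption, not the
    site's published model)."""
    return distance_miles * kg_per_passenger_mile

kg = co2_kg(2036)
print(kg)              # ~222 kg
print(kg / KG_PER_LB)  # ~489 lb
```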

Map of flight path and driving directions from Muscle Shoals to Lopez

See the map of the shortest flight path between Northwest Alabama Regional Airport (MSL) and Lopez Island Airport (LPS).

Airport information

Origin: Northwest Alabama Regional Airport
City: Muscle Shoals, AL
Country: United States
IATA Code: MSL
ICAO Code: KMSL
Coordinates: 34°44′43″N, 87°36′36″W

Destination: Lopez Island Airport
City: Lopez, WA
Country: United States
IATA Code: LPS
ICAO Code: S31
Coordinates: 48°29′2″N, 122°56′16″W
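The coordinates above are given in degrees, minutes, and seconds, while the distance formulas earlier expect decimal degrees. A small conversion sketch (the function name and sign convention are illustrative):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter (N/S/E/W)
    to signed decimal degrees."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

# MSL: 34°44′43″N, 87°36′36″W
print(dms_to_decimal(34, 44, 43, "N"), dms_to_decimal(87, 36, 36, "W"))
# -> 34.745278, -87.61

# LPS: 48°29′2″N, 122°56′16″W
print(dms_to_decimal(48, 29, 2, "N"), dms_to_decimal(122, 56, 16, "W"))
# -> 48.483889, -122.937778
```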