How far is Lopez, WA, from Columbia, MO?

The distance between Columbia (Columbia Regional Airport) and Lopez (Lopez Island Airport) is 1664 miles / 2678 kilometers / 1446 nautical miles.

The driving distance from Columbia (COU) to Lopez (LPS) is 2097 miles / 3375 kilometers, and travel time by car is about 37 hours 21 minutes.

Columbia Regional Airport – Lopez Island Airport

  • 1664 miles
  • 2678 kilometers
  • 1446 nautical miles

Distance from Columbia to Lopez

There are several ways to calculate the distance from Columbia to Lopez. Here are two standard methods:

Vincenty's formula (applied above)
  • 1663.784 miles
  • 2677.601 kilometers
  • 1445.789 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
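As a sketch of how such an ellipsoidal distance can be computed, here is a plain-Python implementation of Vincenty's inverse formula on the WGS84 ellipsoid, fed with the COU and LPS coordinates from the airport table below. The function name and parameter choices are illustrative; the site's exact implementation is not shown.

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2):
    """Geodesic distance in kilometers on the WGS84 ellipsoid
    (Vincenty's inverse formula, solved iteratively)."""
    a = 6378137.0                # semi-major axis (m)
    f = 1 / 298.257223563        # flattening
    b = (1 - f) * a              # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    # Reduced latitudes on the auxiliary sphere
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0           # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # cos2_alpha == 0 only for purely equatorial lines
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                   if cos2_alpha else 0.0)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - d_sigma) / 1000.0

# COU (38°49′5″N, 92°13′10″W) to LPS (48°29′2″N, 122°56′16″W)
d = vincenty_km(38.818056, -92.219444, 48.483889, -122.937778)
print(f"{d:.3f} km")   # ≈ 2677.6 km
```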

Haversine formula
  • 1660.225 miles
  • 2671.874 kilometers
  • 1442.696 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
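A minimal haversine implementation, using a mean Earth radius of 6371 km (a common convention; other radius choices shift the result slightly):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers on a sphere of radius 6371 km."""
    R = 6371.0  # mean Earth radius (km)
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(h))

# Same COU and LPS coordinates as in the airport table below
d = haversine_km(38.818056, -92.219444, 48.483889, -122.937778)
print(f"{d:.3f} km")   # ≈ 2671.9 km
```

The spherical result comes out a few kilometers shorter than the ellipsoidal Vincenty figure, which is typical for routes at these latitudes.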

How long does it take to fly from Columbia to Lopez?

The estimated flight time from Columbia Regional Airport to Lopez Island Airport is 3 hours and 39 minutes.
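The site's exact formula is not stated, but flight-time estimates of this kind are typically a cruise leg at an assumed average speed plus a fixed allowance for taxi, climb, and descent. A sketch under those assumptions (the 500 mph and 30 min figures are illustrative, not the site's values):

```python
def flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough block time: cruise leg plus a fixed taxi/climb/descent
    overhead. Both parameters are assumptions for illustration."""
    total_min = round(distance_miles / cruise_mph * 60) + overhead_min
    return divmod(total_min, 60)  # -> (hours, minutes)

hours, minutes = flight_time(1664)
print(f"{hours} h {minutes} min")  # 3 h 50 min with these assumed parameters
```

Different speed and overhead assumptions explain why such estimates vary by ten minutes or so between calculators.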

Flight carbon footprint between Columbia Regional Airport (COU) and Lopez Island Airport (LPS)

On average, flying from Columbia to Lopez generates about 190 kg of CO2 per passenger, which is equal to 419 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
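The kilogram-to-pound conversion above follows directly from the definition of the pound:

```python
KG_PER_LB = 0.45359237  # exact definition of the avoirdupois pound

co2_kg = 190
co2_lb = co2_kg / KG_PER_LB
print(round(co2_lb))  # 419
```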

Map of flight path and driving directions from Columbia to Lopez

See the map of the shortest flight path between Columbia Regional Airport (COU) and Lopez Island Airport (LPS).

Airport information

Origin: Columbia Regional Airport
City: Columbia, MO
Country: United States
IATA Code: COU
ICAO Code: KCOU
Coordinates: 38°49′5″N, 92°13′10″W
Destination: Lopez Island Airport
City: Lopez, WA
Country: United States
IATA Code: LPS
ICAO Code: S31
Coordinates: 48°29′2″N, 122°56′16″W