
How far is Lopez, WA, from Lake Charles, LA?

The distance between Lake Charles (Lake Charles Regional Airport) and Lopez (Lopez Island Airport) is 2013 miles / 3239 kilometers / 1749 nautical miles.

The driving distance from Lake Charles (LCH) to Lopez (LPS) is 2528 miles / 4068 kilometers, and travel time by car is about 46 hours 13 minutes.

Lake Charles Regional Airport – Lopez Island Airport
  • 2013 miles
  • 3239 kilometers
  • 1749 nautical miles
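
The three figures above are the same distance expressed in different units. The conversion factors are exact by definition: 1 mile = 1.609344 km and 1 nautical mile = 1.852 km. A quick check in Python, starting from the precise Vincenty figure of 2012.758 miles:

```python
# Convert the LCH-LPS distance across units.
# Exact definitions: 1 mile = 1.609344 km, 1 nautical mile = 1.852 km.
KM_PER_MILE = 1.609344
KM_PER_NMI = 1.852

miles = 2012.758              # distance from Vincenty's formula
km = miles * KM_PER_MILE
nmi = km / KM_PER_NMI

print(f"{km:.3f} km")         # ≈ 3239.220 km
print(f"{nmi:.3f} nmi")       # ≈ 1749.039 nmi
```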


Distance from Lake Charles to Lopez

There are several ways to calculate the distance from Lake Charles to Lopez. Here are two standard methods:

Vincenty's formula (applied above)
  • 2012.758 miles
  • 3239.221 kilometers
  • 1749.039 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
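
As a rough illustration, here is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid, applied to the two airports' coordinates. This is a simplified version: it omits the special-case handling needed for nearly antipodal points, where the iteration can fail to converge.

```python
import math

def vincenty_inverse_km(lat1, lon1, lat2, lon2):
    """Distance in km on the WGS-84 ellipsoid via Vincenty's inverse
    formula (no antipodal special-casing)."""
    a, f = 6378137.0, 1 / 298.257223563   # WGS-84 semi-major axis (m), flattening
    b = (1 - f) * a
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):                   # iterate lambda to convergence
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
                 * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0

# LCH (30°7′33″N, 93°13′23″W) to LPS (48°29′2″N, 122°56′16″W)
km = vincenty_inverse_km(30.125833, -93.223056, 48.483889, -122.937778)
print(f"{km:.3f} km")   # ≈ 3239.2 km, matching the figure above
```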

Haversine formula
  • 2011.026 miles
  • 3236.433 kilometers
  • 1747.534 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
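
The haversine computation is compact enough to show in full. A sketch in Python, using the conventional mean earth radius of 6371 km and the coordinates listed in the airport information below:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km, assuming a spherical earth
    with mean radius 6371 km."""
    R = 6371.0
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

# LCH (30°7′33″N, 93°13′23″W) to LPS (48°29′2″N, 122°56′16″W)
km = haversine_km(30.125833, -93.223056, 48.483889, -122.937778)
print(f"{km:.1f} km, {km / 1.609344:.1f} miles")   # ≈ 3236.4 km, 2011.0 miles
```

The small gap between this result and the Vincenty figure (about 3 km over this route) comes entirely from the spherical-versus-ellipsoidal earth model.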

How long does it take to fly from Lake Charles to Lopez?

The estimated flight time from Lake Charles Regional Airport to Lopez Island Airport is 4 hours and 18 minutes.
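
The site does not state how this estimate is derived. As a back-of-the-envelope sketch, dividing the distance by an assumed average block speed reproduces the figure; the 468 mph speed below is a hypothetical value chosen to illustrate, not something given in the source, and real block times depend on aircraft type, winds, and routing.

```python
# Rough flight-time estimate: distance / assumed average block speed.
# 468 mph is a hypothetical illustrative value, not from the source.
distance_miles = 2013
avg_speed_mph = 468

hours_float = distance_miles / avg_speed_mph
h, m = int(hours_float), round((hours_float % 1) * 60)
print(f"about {h} h {m} min")   # → about 4 h 18 min
```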

Flight carbon footprint between Lake Charles Regional Airport (LCH) and Lopez Island Airport (LPS)

On average, flying from Lake Charles to Lopez generates about 219 kg of CO2 per passenger, which is equivalent to 483 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
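
The kilograms-to-pounds conversion uses the standard factor 1 kg ≈ 2.20462 lb; the 219 kg figure itself presumably comes from a fuel-burn model not shown here.

```python
# Convert the per-passenger CO2 estimate from kilograms to pounds.
LB_PER_KG = 2.2046226218   # standard conversion factor
co2_kg = 219
co2_lb = co2_kg * LB_PER_KG
print(round(co2_lb))       # → 483
```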

Map of flight path and driving directions from Lake Charles to Lopez

See the map of the shortest flight path between Lake Charles Regional Airport (LCH) and Lopez Island Airport (LPS).

Airport information

Origin Lake Charles Regional Airport
City: Lake Charles, LA
Country: United States
IATA Code: LCH
ICAO Code: KLCH
Coordinates: 30°7′33″N, 93°13′23″W
Destination Lopez Island Airport
City: Lopez, WA
Country: United States
IATA Code: LPS
ICAO Code: S31
Coordinates: 48°29′2″N, 122°56′16″W
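
The coordinates above are given in degrees-minutes-seconds, while the distance formulas expect signed decimal degrees. A small converter (southern and western hemispheres negate the value):

```python
def dms_to_decimal(deg, minutes, seconds, hemi):
    """Convert degrees/minutes/seconds plus hemisphere letter (N/S/E/W)
    to signed decimal degrees."""
    value = deg + minutes / 60 + seconds / 3600
    return -value if hemi in ("S", "W") else value

# Lake Charles Regional Airport: 30°7′33″N, 93°13′23″W
print(dms_to_decimal(30, 7, 33, "N"))    # ≈ 30.125833
print(dms_to_decimal(93, 13, 23, "W"))   # ≈ -93.223056
```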