
How far is Lopez, WA, from Baton Rouge, LA?

The distance between Baton Rouge (Baton Rouge Metropolitan Airport) and Lopez (Lopez Island Airport) is 2078 miles / 3344 kilometers / 1805 nautical miles.

The driving distance from Baton Rouge (BTR) to Lopez (LPS) is 2615 miles / 4209 kilometers, and travel time by car is about 47 hours 25 minutes.

Baton Rouge Metropolitan Airport – Lopez Island Airport

2078 miles / 3344 kilometers / 1805 nautical miles


Distance from Baton Rouge to Lopez

There are several ways to calculate the distance from Baton Rouge to Lopez. Here are two standard methods:

Vincenty's formula (applied above)
  • 2077.671 miles
  • 3343.688 kilometers
  • 1805.447 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
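As a rough sketch (not the calculator's own code), Vincenty's inverse formula can be implemented directly on the WGS-84 ellipsoid. The decimal coordinates in the usage lines below are approximate conversions of the BTR and LPS coordinates listed in the airport information at the end of this page.

```python
import math

def vincenty_distance_m(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Vincenty's inverse formula on the WGS-84 ellipsoid; returns meters."""
    a = 6378137.0                      # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563              # WGS-84 flattening
    b = (1 - f) * a                    # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0                                 # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos_sq_alpha = 1 - sin_alpha ** 2
        cos2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos_sq_alpha
                       if cos_sq_alpha else 0.0)       # equatorial geodesic
        C = f / 16 * cos_sq_alpha * (4 + f * (4 - 3 * cos_sq_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos2sigma_m
                                     + C * cos_sigma * (-1 + 2 * cos2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u_sq = cos_sq_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
    B = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
    delta_sigma = B * sin_sigma * (cos2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos2sigma_m ** 2)
        - B / 6 * cos2sigma_m * (-3 + 4 * sin_sigma ** 2)
                               * (-3 + 4 * cos2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# BTR (30°31′59″N, 91°8′58″W) to LPS (48°29′2″N, 122°56′16″W), in decimal degrees
meters = vincenty_distance_m(30.5331, -91.1494, 48.4839, -122.9378)
print(f"{meters / 1609.344:.1f} mi  /  {meters / 1000:.1f} km")  # close to the ~2078 mi above
```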

Haversine formula
  • 2075.506 miles
  • 3340.202 kilometers
  • 1803.565 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
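The spherical version is much shorter. This sketch assumes the conventional mean Earth radius of 6,371 km, which is itself an assumption rather than something stated on this page:

```python
import math

def haversine_distance_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle (haversine) distance on a sphere, in kilometers."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

km = haversine_distance_km(30.5331, -91.1494, 48.4839, -122.9378)
print(f"{km:.1f} km  /  {km / 1.609344:.1f} mi")  # roughly 3340 km / 2076 mi
```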

How long does it take to fly from Baton Rouge to Lopez?

The estimated flight time from Baton Rouge Metropolitan Airport to Lopez Island Airport is 4 hours and 26 minutes.
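The assumptions behind that figure are not published here. A common back-of-the-envelope estimate is cruise time at an assumed average speed plus a fixed allowance for taxi, climb and descent; with the hypothetical values below it lands in the same ballpark as the quoted time, though not on the exact number:

```python
def estimate_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough block-time estimate: cruise leg plus fixed taxi/climb/descent time.
    Both parameters are assumptions, not the calculator's published method."""
    total_min = round(distance_miles / cruise_mph * 60 + overhead_min)
    return divmod(total_min, 60)

hours, minutes = estimate_flight_time(2078)
print(f"{hours} h {minutes} min")  # about 4 h 39 min with these assumptions
```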

Flight carbon footprint between Baton Rouge Metropolitan Airport (BTR) and Lopez Island Airport (LPS)

On average, flying from Baton Rouge to Lopez generates about 226 kg of CO2 per passenger, and 226 kilograms equals 498 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
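Estimates like this are typically the flight distance multiplied by a per-passenger emission factor. The factor in the sketch below is simply back-calculated from the figures above (226 kg over roughly 3,344 km), so it is illustrative rather than a published value:

```python
def co2_per_passenger_kg(distance_km, kg_per_pax_km=0.0676):
    """Per-passenger CO2 estimate: distance times an assumed emission factor
    (kg CO2 per passenger-kilometer), back-calculated from this page's figures."""
    return distance_km * kg_per_pax_km

kg = co2_per_passenger_kg(3344)
print(f"{kg:.0f} kg CO2  /  {kg * 2.20462:.0f} lbs")  # about 226 kg / 498 lbs
```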

Map of flight path and driving directions from Baton Rouge to Lopez

See the map of the shortest flight path between Baton Rouge Metropolitan Airport (BTR) and Lopez Island Airport (LPS).

Airport information

Origin Baton Rouge Metropolitan Airport
City: Baton Rouge, LA
Country: United States
IATA Code: BTR
ICAO Code: KBTR
Coordinates: 30°31′59″N, 91°8′58″W
Destination Lopez Island Airport
City: Lopez, WA
Country: United States
IATA Code: LPS
ICAO Code: S31
Coordinates: 48°29′2″N, 122°56′16″W
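The distance formulas above work in decimal degrees, so the DMS coordinates listed here need converting first. A minimal helper, with the hemisphere handling assumed as shown:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus an N/S/E/W letter to decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# Coordinates from the airport information above
btr = (dms_to_decimal(30, 31, 59, "N"), dms_to_decimal(91, 8, 58, "W"))
lps = (dms_to_decimal(48, 29, 2, "N"), dms_to_decimal(122, 56, 16, "W"))
print(btr, lps)  # approximately (30.5331, -91.1494) and (48.4839, -122.9378)
```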