
How far is Lopez, WA, from Augusta, GA?

The distance between Augusta (Augusta Regional Airport) and Lopez (Lopez Island Airport) is 2348 miles / 3779 kilometers / 2040 nautical miles.

The driving distance from Augusta (AGS) to Lopez (LPS) is 2911 miles / 4685 kilometers, and travel time by car is about 52 hours 28 minutes.

Augusta Regional Airport – Lopez Island Airport: 2348 miles / 3779 kilometers / 2040 nautical miles

Distance from Augusta to Lopez

There are several ways to calculate the distance from Augusta to Lopez. Here are two standard methods:

Vincenty's formula (applied above)
  • 2347.996 miles
  • 3778.734 kilometers
  • 2040.353 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
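
If you want to reproduce the figure above, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The decimal coordinates are converted from the DMS values listed under Airport information below; the last digits may differ slightly from the table above depending on rounding and the exact ellipsoid constants used.

  import math

  def vincenty_miles(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
      # Vincenty's inverse formula on the WGS-84 ellipsoid; returns statute miles.
      # A minimal sketch: nearly antipodal points can fail to converge and
      # would need special handling in production code.
      a = 6378137.0               # WGS-84 semi-major axis (meters)
      f = 1 / 298.257223563       # WGS-84 flattening
      b = (1 - f) * a

      L = math.radians(lon2 - lon1)
      U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
      U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
      sinU1, cosU1 = math.sin(U1), math.cos(U1)
      sinU2, cosU2 = math.sin(U2), math.cos(U2)

      lam = L
      for _ in range(max_iter):
          sin_lam, cos_lam = math.sin(lam), math.cos(lam)
          sin_sigma = math.hypot(cosU2 * sin_lam,
                                 cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
          if sin_sigma == 0:
              return 0.0          # coincident points
          cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
          sigma = math.atan2(sin_sigma, cos_sigma)
          sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
          cos2_alpha = 1 - sin_alpha ** 2
          cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                     if cos2_alpha else 0.0)      # points on the equator
          C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
          lam_prev = lam
          lam = L + (1 - C) * f * sin_alpha * (
              sigma + C * sin_sigma * (
                  cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
          if abs(lam - lam_prev) < tol:
              break

      u2 = cos2_alpha * (a * a - b * b) / (b * b)
      A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
      B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
      d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
          cos_sigma * (-1 + 2 * cos_2sm ** 2)
          - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
          * (-3 + 4 * cos_2sm ** 2)))
      return b * A * (sigma - d_sigma) / 1609.344

  # AGS: 33°22′11″N, 81°57′52″W   LPS: 48°29′2″N, 122°56′16″W
  print(vincenty_miles(33.369722, -81.964444, 48.483889, -122.937778))
  # ≈ 2348 miles, in line with the Vincenty figure above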

Haversine formula
  • 2343.807 miles
  • 3771.991 kilometers
  • 2036.712 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
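
The haversine version is much shorter. A minimal Python sketch, assuming a mean earth radius of 3,958.8 miles (the exact radius the calculator uses isn't stated, so the trailing digits may differ):

  import math

  def haversine_miles(lat1, lon1, lat2, lon2, radius_mi=3958.8):
      # Great-circle distance on a sphere; radius_mi is the assumed mean
      # earth radius in statute miles.
      phi1, phi2 = math.radians(lat1), math.radians(lat2)
      dphi = math.radians(lat2 - lat1)
      dlam = math.radians(lon2 - lon1)
      h = (math.sin(dphi / 2) ** 2
           + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
      return 2 * radius_mi * math.asin(math.sqrt(h))

  print(haversine_miles(33.369722, -81.964444, 48.483889, -122.937778))
  # ≈ 2344 miles, slightly shorter than the ellipsoidal result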

How long does it take to fly from Augusta to Lopez?

The estimated flight time from Augusta Regional Airport to Lopez Island Airport is 4 hours and 56 minutes.
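
The page doesn't publish its timing model, but estimates like this are typically the great-circle distance divided by an assumed average block speed, sometimes plus a fixed allowance for taxi, climb, and descent. A sketch under those assumptions (the 476 mph figure is simply back-solved from the numbers above, not a documented parameter):

  def flight_time_hm(distance_mi, block_mph=476, overhead_min=0):
      # Rough block-time estimate; both parameters are assumptions.
      total_min = round(distance_mi / block_mph * 60) + overhead_min
      return divmod(total_min, 60)

  hours, minutes = flight_time_hm(2348)
  print(f"{hours} hours {minutes} minutes")   # 4 hours 56 minutes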

Flight carbon footprint between Augusta Regional Airport (AGS) and Lopez Island Airport (LPS)

On average, flying from Augusta to Lopez generates about 257 kg of CO2 per passenger, which is roughly 567 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
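
For reference, the unit conversion behind the pounds figure (the 257 kg value itself comes from the calculator and is not derived here):

  co2_kg = 257                 # per-passenger estimate quoted above
  co2_lb = co2_kg * 2.20462    # kilograms to pounds
  print(round(co2_lb))         # 567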

Map of flight path and driving directions from Augusta to Lopez

See the map of the shortest flight path between Augusta Regional Airport (AGS) and Lopez Island Airport (LPS).

Airport information

Origin Augusta Regional Airport
City: Augusta, GA
Country: United States
IATA Code: AGS
ICAO Code: KAGS
Coordinates: 33°22′11″N, 81°57′52″W
Destination Lopez Island Airport
City: Lopez, WA
Country: United States
IATA Code: LPS
ICAO Code: S31
Coordinates: 48°29′2″N, 122°56′16″W
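
To plug these coordinates into the distance functions above, convert the degrees/minutes/seconds values to decimal degrees, with south latitudes and west longitudes negative. A small helper, using the coordinates listed above:

  def dms_to_decimal(degrees, minutes, seconds, hemisphere):
      # Convert DMS plus hemisphere letter (N/S/E/W) to signed decimal degrees.
      sign = -1 if hemisphere in ("S", "W") else 1
      return sign * (degrees + minutes / 60 + seconds / 3600)

  ags = (dms_to_decimal(33, 22, 11, "N"), dms_to_decimal(81, 57, 52, "W"))
  lps = (dms_to_decimal(48, 29, 2, "N"), dms_to_decimal(122, 56, 16, "W"))
  print(ags)   # (33.3697..., -81.9644...)
  print(lps)   # (48.4838..., -122.9377...)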