
How far is Lopez, WA, from St George, UT?

The distance between St George (St. George Municipal Airport) and Lopez (Lopez Island Airport) is 923 miles / 1485 kilometers / 802 nautical miles.

The driving distance from St George (SGU) to Lopez (LPS) is 1233 miles / 1984 kilometers, and travel time by car is about 22 hours 22 minutes.

St. George Municipal Airport – Lopez Island Airport

  • 923 miles
  • 1485 kilometers
  • 802 nautical miles


Distance from St George to Lopez

There are several ways to calculate the distance from St George to Lopez. Here are two standard methods:

Vincenty's formula (applied above)
  • 922.696 miles
  • 1484.936 kilometers
  • 801.801 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
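The Vincenty inverse method can be sketched as follows. This is a minimal implementation on the WGS-84 ellipsoid using the airport coordinates listed below; the iteration limit and convergence tolerance are reasonable defaults, not necessarily what this site uses.

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2):
    """Ellipsoidal distance (km) via Vincenty's inverse formula, WGS-84."""
    a = 6378137.0                  # semi-major axis (m)
    f = 1 / 298.257223563          # flattening
    b = (1 - f) * a                # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0             # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (2 * cos_2sm ** 2 - 1)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sm + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sm ** 2)
            - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
                             * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0  # metres -> km

# SGU (37°2′11″N, 113°30′37″W) to LPS (48°29′2″N, 122°56′16″W)
print(round(vincenty_km(37.036389, -113.510278, 48.483889, -122.937778), 3))
```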

Haversine formula
  • 922.695 miles
  • 1484.934 kilometers
  • 801.800 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
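The haversine computation is much simpler. A minimal sketch, assuming the commonly used mean Earth radius of 6371 km (the exact radius this site assumes is not stated, so the result may differ slightly from the figure above):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, r=6371.0):
    """Great-circle distance (km) on a sphere of mean radius r."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(h))

# SGU to LPS, using the coordinates from the airport information below
print(round(haversine_km(37.036389, -113.510278, 48.483889, -122.937778), 1))
```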

How long does it take to fly from St George to Lopez?

The estimated flight time from St. George Municipal Airport to Lopez Island Airport is 2 hours and 14 minutes.
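A flight-time estimate of this kind is typically cruise time plus a fixed allowance for taxi, climb, and descent. The sketch below assumes a ~500 mph cruise speed and a 30-minute overhead; these parameters are illustrative assumptions, not the site's actual formula, so the result will not match the 2 h 14 min figure exactly.

```python
def flight_time_hours(distance_miles, cruise_mph=500, overhead_h=0.5):
    # Assumed model: fixed overhead for taxi/climb/descent plus cruise time.
    return overhead_h + distance_miles / cruise_mph

t = flight_time_hours(923)
print(f"{int(t)} h {round(t % 1 * 60)} min")  # → 2 h 21 min
```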

Flight carbon footprint between St. George Municipal Airport (SGU) and Lopez Island Airport (LPS)

On average, flying from St George to Lopez generates about 145 kg of CO2 per passenger; 145 kilograms is equivalent to 320 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
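The kilogram-to-pound conversion above can be checked directly using the exact definition of the pound (0.45359237 kg):

```python
KG_PER_LB = 0.45359237   # exact definition of the avoirdupois pound

co2_kg = 145
co2_lb = co2_kg / KG_PER_LB
print(round(co2_lb))     # → 320
```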

Map of flight path and driving directions from St George to Lopez

See the map of the shortest flight path between St. George Municipal Airport (SGU) and Lopez Island Airport (LPS).

Airport information

Origin St. George Municipal Airport
City: St George, UT
Country: United States
IATA Code: SGU
ICAO Code: KSGU
Coordinates: 37°2′11″N, 113°30′37″W
Destination Lopez Island Airport
City: Lopez, WA
Country: United States
IATA Code: LPS
ICAO Code: S31
Coordinates: 48°29′2″N, 122°56′16″W
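The coordinates above are in degrees/minutes/seconds; the decimal-degree values used by the distance formulas can be recovered with a small conversion helper (a sketch, with south and west taken as negative):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds + hemisphere to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# SGU: 37°2′11″N, 113°30′37″W
print(round(dms_to_decimal(37, 2, 11, "N"), 6))    # → 37.036389
print(round(dms_to_decimal(113, 30, 37, "W"), 6))  # → -113.510278
```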