
How far is Lopez, WA, from Richmond, VA?

The distance between Richmond (Richmond International Airport) and Lopez (Lopez Island Airport) is 2393 miles / 3850 kilometers / 2079 nautical miles.

The driving distance from Richmond (RIC) to Lopez (LPS) is 2949 miles / 4746 kilometers, and travel time by car is about 53 hours 59 minutes.

Richmond International Airport – Lopez Island Airport

  • 2393 miles
  • 3850 kilometers
  • 2079 nautical miles

Distance from Richmond to Lopez

There are several ways to calculate the distance from Richmond to Lopez. Here are two standard methods:

Vincenty's formula (applied above)
  • 2392.528 miles
  • 3850.401 kilometers
  • 2079.050 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
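As a sketch of how such a figure can be reproduced, the following is a standard implementation of Vincenty's inverse formula on the WGS-84 ellipsoid, using the airport coordinates listed below. The function name and convergence tolerance are illustrative choices, not taken from this site:

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns statute miles."""
    a = 6378137.0                 # semi-major axis (meters)
    f = 1 / 298.257223563         # flattening
    b = a * (1 - f)               # semi-minor axis

    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(phi1))   # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):          # iterate until lambda converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
              * (-3 + 4 * cos_2sigma_m ** 2)))
    meters = b * A * (sigma - delta_sigma)
    return meters / 1609.344      # meters -> statute miles

# RIC (37°30'18"N, 77°19'10"W) to LPS (48°29'2"N, 122°56'16"W)
print(round(vincenty_miles(37.50500, -77.31944, 48.48389, -122.93778), 3))
```

Run with the coordinates from the airport information section, this yields a distance close to the 2392.528-mile figure above; small differences can arise from coordinate precision and the convergence tolerance.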

Haversine formula
  • 2387.019 miles
  • 3841.534 kilometers
  • 2074.263 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
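The haversine calculation is short enough to show in full. This sketch assumes a mean Earth radius of 6371 km and converts the result to all three units using the exact factors 1 mile = 1.609344 km and 1 nautical mile = 1.852 km:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere (mean Earth radius ~6371 km)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# RIC (37°30'18"N, 77°19'10"W) to LPS (48°29'2"N, 122°56'16"W)
km = haversine_km(37.50500, -77.31944, 48.48389, -122.93778)
print(f"{km / 1.609344:.1f} mi  {km:.1f} km  {km / 1.852:.1f} nm")
```

This reproduces the haversine figures above to within a fraction of a mile; the exact decimals depend on the radius chosen for the spherical model.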

How long does it take to fly from Richmond to Lopez?

The estimated flight time from Richmond International Airport to Lopez Island Airport is 5 hours and 1 minute.

Flight carbon footprint between Richmond International Airport (RIC) and Lopez Island Airport (LPS)

On average, flying from Richmond to Lopez generates about 263 kg of CO2 per passenger (equivalent to roughly 579 pounds). These figures are estimates and include only the CO2 generated by burning jet fuel.
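The kilogram-to-pound conversion can be checked directly; 1 lb is defined as exactly 0.45359237 kg:

```python
KG_PER_LB = 0.45359237            # exact definition of the avoirdupois pound

co2_kg = 263                      # estimated CO2 per passenger, from above
co2_lb = co2_kg / KG_PER_LB
print(f"{co2_kg} kg = {co2_lb:.1f} lb")
```

The exact value is about 579.8 lb, consistent with the 579-pound figure quoted above.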

Map of flight path and driving directions from Richmond to Lopez

See the map of the shortest flight path between Richmond International Airport (RIC) and Lopez Island Airport (LPS).

Airport information

Origin Richmond International Airport
City: Richmond, VA
Country: United States
IATA Code: RIC
ICAO Code: KRIC
Coordinates: 37°30′18″N, 77°19′10″W
Destination Lopez Island Airport
City: Lopez, WA
Country: United States
IATA Code: LPS
ICAO Code: S31
Coordinates: 48°29′2″N, 122°56′16″W