
How far is Lopez, WA, from Dubois, PA?

The distance between Dubois (DuBois Regional Airport) and Lopez (Lopez Island Airport) is 2190 miles / 3524 kilometers / 1903 nautical miles.

The driving distance from Dubois (DUJ) to Lopez (LPS) is 2641 miles / 4251 kilometers, and travel time by car is about 48 hours 11 minutes.

DuBois Regional Airport – Lopez Island Airport

  • 2190 miles
  • 3524 kilometers
  • 1903 nautical miles


Distance from Dubois to Lopez

There are several ways to calculate the distance from Dubois to Lopez. Here are two standard methods:

Vincenty's formula (applied above)
  • 2189.799 miles
  • 3524.140 kilometers
  • 1902.883 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
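As a sketch of how such a figure can be reproduced, here is Vincenty's inverse formula on the WGS-84 ellipsoid, using the airport coordinates from the information section below. This is an illustration of the method, not necessarily the site's exact implementation.

```python
import math

def vincenty_m(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Ellipsoidal distance in meters between two lat/lon points (WGS-84)."""
    a = 6378137.0             # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563     # WGS-84 flattening
    b = (1 - f) * a
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)
    lam = L
    for _ in range(max_iter):
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.hypot(cosU2 * sinLam,
                              cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = cosSigma - 2 * sinU1 * sinU2 / cos2Alpha
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (cos2SigmaM + C * cosSigma *
                                    (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < tol:
            break
    u2 = cos2Alpha * (a * a - b * b) / (b * b)
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    deltaSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2) -
        B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2) *
        (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - deltaSigma)

# DUJ (41°10′41″N, 78°53′55″W) to LPS (48°29′2″N, 122°56′16″W)
duj = (41 + 10 / 60 + 41 / 3600, -(78 + 53 / 60 + 55 / 3600))
lps = (48 + 29 / 60 + 2 / 3600, -(122 + 56 / 60 + 16 / 3600))
dist_km = vincenty_m(duj[0], duj[1], lps[0], lps[1]) / 1000  # ≈ 3524 km
```

The iteration converges in a handful of steps for points like these that are far from antipodal.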

Haversine formula
  • 2184.115 miles
  • 3514.992 kilometers
  • 1897.944 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
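The spherical calculation is much shorter. The sketch below assumes the conventional mean Earth radius of 6371 km; the site's chosen radius may differ slightly, so the result is only approximately the figure quoted above.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, R=6371.0):
    """Great-circle distance in km on a sphere of radius R (km)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

# DUJ to LPS, coordinates in decimal degrees
dist_km = haversine_km(41.178056, -78.898611, 48.483889, -122.937778)
```

Because the Earth bulges at the equator, the spherical result here comes out roughly 9 km shorter than the ellipsoidal Vincenty figure.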

How long does it take to fly from Dubois to Lopez?

The estimated flight time from DuBois Regional Airport to Lopez Island Airport is 4 hours and 38 minutes.
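The page does not state how this estimate is derived. A common approach is distance divided by an assumed average block speed; the 473 mph speed below is a hypothetical value chosen so the arithmetic reproduces the quoted time, not the site's actual parameter.

```python
def flight_time(distance_miles, avg_speed_mph=473):
    """Estimate flight time as (hours, minutes) at an assumed average speed."""
    hours = distance_miles / avg_speed_mph
    h = int(hours)
    m = round((hours - h) * 60)
    return h, m

h, m = flight_time(2190)  # → (4, 38) with the assumed speed
```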

Flight carbon footprint between DuBois Regional Airport (DUJ) and Lopez Island Airport (LPS)

On average, flying from Dubois to Lopez generates about 239 kg of CO2 per passenger; 239 kilograms is equal to 527 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
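The unit conversion, and the per-mile intensity implied by the quoted figures, can be checked directly (the per-mile rate below is simply derived from the page's own numbers, not an independent emission factor):

```python
KG_PER_LB = 0.45359237  # exact definition of the avoirdupois pound

co2_kg = 239
co2_lbs = co2_kg / KG_PER_LB   # ≈ 526.9, which rounds to the quoted 527 lbs
per_mile = co2_kg / 2190       # ≈ 0.109 kg CO2 per passenger-mile
```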

Map of flight path and driving directions from Dubois to Lopez

See the map of the shortest flight path between DuBois Regional Airport (DUJ) and Lopez Island Airport (LPS).

Airport information

Origin: DuBois Regional Airport
City: Dubois, PA
Country: United States
IATA Code: DUJ
ICAO Code: KDUJ
Coordinates: 41°10′41″N, 78°53′55″W

Destination: Lopez Island Airport
City: Lopez, WA
Country: United States
IATA Code: LPS
ICAO Code: S31
Coordinates: 48°29′2″N, 122°56′16″W
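The coordinates above are given in degrees/minutes/seconds; the distance formulas need decimal degrees. A small conversion helper:

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert D°M′S″ plus a hemisphere letter (N/S/E/W) to decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

duj_lat = dms_to_decimal(41, 10, 41, "N")   # ≈ 41.1781
duj_lon = dms_to_decimal(78, 53, 55, "W")   # ≈ -78.8986
```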