
How far is Lopez, WA, from Waskaganish?

The distance between Waskaganish (Waskaganish Airport) and Lopez (Lopez Island Airport) is 1950 miles / 3138 kilometers / 1694 nautical miles.

The driving distance from Waskaganish (YKQ) to Lopez (LPS) is 2886 miles / 4645 kilometers, and travel time by car is about 58 hours 56 minutes.

Waskaganish Airport – Lopez Island Airport
  • 1950 miles
  • 3138 kilometers
  • 1694 nautical miles


Distance from Waskaganish to Lopez

There are several ways to calculate the distance from Waskaganish to Lopez. Here are two standard methods:

Vincenty's formula (applied above)
  • 1949.675 miles
  • 3137.698 kilometers
  • 1694.221 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
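A quick way to reproduce the ellipsoidal figure is the geopy library's geodesic distance, which uses Karney's algorithm on the WGS-84 ellipsoid rather than Vincenty proper but agrees with it to well under a metre at this range. A minimal sketch, using the decimal equivalents of the airport coordinates listed under "Airport information" below:

```python
# Sketch: ellipsoidal (WGS-84) distance via geopy's geodesic calculation.
# Note: geopy uses Karney's algorithm, not Vincenty's, but the results
# agree to well under a metre over this route.
from geopy.distance import geodesic

ykq = (51.473056, -78.758056)    # Waskaganish Airport (51°28′23″N, 78°45′29″W)
lps = (48.483889, -122.937778)   # Lopez Island Airport (48°29′2″N, 122°56′16″W)

d = geodesic(ykq, lps)
print(f"{d.miles:.3f} mi, {d.km:.3f} km, {d.nautical:.3f} NM")
# Expected to land very close to the Vincenty figures above:
# ~1949.7 mi / ~3137.7 km / ~1694.2 NM
```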

Haversine formula
  • 1943.734 miles
  • 3128.137 kilometers
  • 1689.059 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
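The haversine calculation is short enough to write by hand. A minimal sketch, assuming a mean Earth radius of 6,371 km and the decimal equivalents of the airport coordinates listed below:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, r_km=6371.0):
    """Great-circle distance on a sphere of radius r_km (mean Earth radius)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * r_km * math.asin(math.sqrt(a))

km = haversine_km(51.473056, -78.758056, 48.483889, -122.937778)
print(f"{km:.1f} km = {km / 1.609344:.1f} mi = {km / 1.852:.1f} NM")
# ≈ 3128 km / 1944 mi / 1689 NM, matching the haversine figures above
```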

How long does it take to fly from Waskaganish to Lopez?

The estimated flight time from Waskaganish Airport to Lopez Island Airport is 4 hours and 11 minutes.
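Estimates like this are typically the great-circle distance divided by an average cruise speed, plus a fixed allowance for taxi, climb, and descent. The calculator's exact parameters are not published, so the speed and overhead below are illustrative assumptions only and give a figure in the same ballpark as, not identical to, the quoted 4 hours 11 minutes:

```python
# Rough flight-time sketch: distance / assumed cruise speed + fixed overhead.
# The 500 mph cruise and 30-minute overhead are assumptions for illustration.
distance_mi = 1950
cruise_mph = 500          # assumed average cruise speed
overhead_min = 30         # assumed taxi/climb/descent allowance

total_min = distance_mi / cruise_mph * 60 + overhead_min
print(f"{int(total_min // 60)} h {int(total_min % 60)} min")  # ~4 h 24 min
```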

Flight carbon footprint between Waskaganish Airport (YKQ) and Lopez Island Airport (LPS)

On average, flying from Waskaganish to Lopez generates about 213 kg (469 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
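The quoted figure works out to roughly 0.068 kg of CO2 per passenger-kilometre on this route. A small sketch of that back-calculation and the kilograms-to-pounds conversion, derived only from the numbers quoted above:

```python
co2_kg = 213          # quoted per-passenger CO2 estimate for this route
distance_km = 3138    # quoted flight distance

kg_per_km = co2_kg / distance_km
co2_lb = co2_kg * 2.20462  # kilograms -> pounds

print(f"{kg_per_km:.3f} kg CO2 per passenger-km")  # ≈ 0.068
print(f"{co2_lb:.1f} lb CO2 per passenger")        # ≈ 469.6 lb (quoted as 469 above)
```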

Map of flight path and driving directions from Waskaganish to Lopez

See the map of the shortest flight path between Waskaganish Airport (YKQ) and Lopez Island Airport (LPS).

Airport information

Origin Waskaganish Airport
City: Waskaganish
Country: Canada
IATA Code: YKQ
ICAO Code: CYKQ
Coordinates: 51°28′23″N, 78°45′29″W
Destination Lopez Island Airport
City: Lopez, WA
Country: United States
IATA Code: LPS
ICAO Code: S31
Coordinates: 48°29′2″N, 122°56′16″W
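The coordinates above are given in degrees, minutes, and seconds, while the distance formulas earlier expect decimal degrees. A small conversion sketch:

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds to signed decimal degrees."""
    value = deg + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

# Waskaganish Airport (YKQ): 51°28′23″N, 78°45′29″W
print(dms_to_decimal(51, 28, 23, "N"), dms_to_decimal(78, 45, 29, "W"))
# ≈ 51.473056, -78.758056

# Lopez Island Airport (LPS): 48°29′2″N, 122°56′16″W
print(dms_to_decimal(48, 29, 2, "N"), dms_to_decimal(122, 56, 16, "W"))
# ≈ 48.483889, -122.937778
```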