How far is Lopez, WA, from Wunnumin Lake?

The distance between Wunnumin Lake (Wunnumin Lake Airport) and Lopez (Lopez Island Airport) is 1494 miles / 2404 kilometers / 1298 nautical miles.

The driving distance from Wunnumin Lake (WNN) to Lopez (LPS) is 2034 miles / 3273 kilometers, and travel time by car is about 48 hours 23 minutes.

Distance from Wunnumin Lake to Lopez

There are several ways to calculate the distance from Wunnumin Lake to Lopez. Here are two standard methods:

Vincenty's formula (applied above)
  • 1494.082 miles
  • 2404.493 kilometers
  • 1298.322 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
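For readers who want to reproduce the number, here is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid, using the airport coordinates listed in the airport information below. The calculator's exact implementation and rounding may differ.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns meters."""
    a = 6378137.0             # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563     # WGS-84 flattening
    b = (1 - f) * a           # semi-minor axis

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):      # iterate until lambda converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0        # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma)

# WNN (52°53′38″N, 89°17′21″W) to LPS (48°29′2″N, 122°56′16″W)
meters = vincenty_distance(52.89389, -89.28917, 48.48389, -122.93778)
print(meters / 1000)      # ~2404 km
print(meters / 1609.344)  # ~1494 mi
```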

Haversine formula
  • 1489.591 miles
  • 2397.264 kilometers
  • 1294.419 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth, giving the great-circle distance: the shortest path between two points along the surface.
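A matching Python sketch of the haversine formula, assuming the commonly used mean Earth radius of 6371 km (the radius this calculator uses is not stated):

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere; returns kilometers."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(h))

km = haversine_distance(52.89389, -89.28917, 48.48389, -122.93778)
print(km)             # ~2397 km
print(km / 1.609344)  # ~1490 mi
```

The small gap between the two results (about 7 km here) comes from the haversine formula's spherical assumption versus Vincenty's ellipsoidal model.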

How long does it take to fly from Wunnumin Lake to Lopez?

The estimated flight time from Wunnumin Lake Airport to Lopez Island Airport is 3 hours and 19 minutes.
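The page does not state the model behind this estimate. As a purely hypothetical illustration, dividing the 1494-mile Vincenty distance by an assumed average block speed of about 450 mph happens to reproduce the figure:

```python
# Hypothetical model: flight time = distance / average block speed.
# The ~450 mph speed is an assumption that reproduces the 3 h 19 min estimate;
# the calculator's actual model is not documented.
distance_mi = 1494.082
avg_speed_mph = 450.0

hours = distance_mi / avg_speed_mph
h, m = int(hours), round((hours - int(hours)) * 60)
print(f"{h} hours {m} minutes")  # -> 3 hours 19 minutes
```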

Flight carbon footprint between Wunnumin Lake Airport (WNN) and Lopez Island Airport (LPS)

On average, flying from Wunnumin Lake to Lopez generates about 179 kg (395 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
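As a quick sanity check on the units: the kilogram-to-pound conversion below is standard, and the last line shows the per-distance emission factor implied by the page's estimate (the estimation method itself is not documented):

```python
co2_kg = 179.0
co2_lb = co2_kg * 2.20462          # standard kg -> lb conversion
print(round(co2_lb))               # -> 395 lb

distance_km = 2404.493
# Implied emission factor for this route, per passenger:
print(round(co2_kg / distance_km * 1000, 1))  # ~74.4 g CO2 per passenger-km
```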

Map of flight path and driving directions from Wunnumin Lake to Lopez

See the map of the shortest flight path between Wunnumin Lake Airport (WNN) and Lopez Island Airport (LPS).

Airport information

Origin: Wunnumin Lake Airport
City: Wunnumin Lake
Country: Canada
IATA Code: WNN
ICAO Code: CKL3
Coordinates: 52°53′38″N, 89°17′21″W

Destination: Lopez Island Airport
City: Lopez, WA
Country: United States
IATA Code: LPS
ICAO Code: S31
Coordinates: 48°29′2″N, 122°56′16″W
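The coordinates above are given in degrees, minutes, and seconds, while distance formulas such as Vincenty and haversine expect signed decimal degrees. A small conversion sketch (the helper name is illustrative):

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere letter to signed decimal degrees."""
    dd = deg + minutes / 60 + seconds / 3600
    return -dd if hemisphere in ("S", "W") else dd  # south/west are negative

print(dms_to_decimal(52, 53, 38, "N"))   # WNN latitude  -> 52.8939
print(dms_to_decimal(89, 17, 21, "W"))   # WNN longitude -> -89.2892
print(dms_to_decimal(48, 29, 2, "N"))    # LPS latitude  -> 48.4839
print(dms_to_decimal(122, 56, 16, "W"))  # LPS longitude -> -122.9378
```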