How far is Lopez, WA, from Kangiqsujuaq?

The distance between Kangiqsujuaq (Kangiqsujuaq (Wakeham Bay) Airport) and Lopez (Lopez Island Airport) is 2149 miles / 3459 kilometers / 1868 nautical miles.

The driving distance from Kangiqsujuaq (YWB) to Lopez (LPS) is 3379 miles / 5438 kilometers, and travel time by car is about 75 hours 48 minutes.

Kangiqsujuaq (Wakeham Bay) Airport – Lopez Island Airport

2149 miles / 3459 kilometers / 1868 nautical miles

Distance from Kangiqsujuaq to Lopez

There are several ways to calculate the distance from Kangiqsujuaq to Lopez. Here are two standard methods:

Vincenty's formula (applied above)
  • 2149.383 miles
  • 3459.096 kilometers
  • 1867.763 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
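For reference, the Vincenty inverse method can be sketched in a few dozen lines of Python. The snippet below is a minimal illustration using the standard WGS-84 ellipsoid constants and the airport coordinates listed at the bottom of this page; it shows the technique, not the calculator's actual implementation:

```python
import math

def vincenty_inverse(lat1, lon1, lat2, lon2):
    """Distance in meters on the WGS-84 ellipsoid (Vincenty, 1975)."""
    a = 6378137.0               # semi-major axis (m)
    f = 1 / 298.257223563       # flattening
    b = (1 - f) * a             # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L                     # iterate on the longitude difference
    for _ in range(200):
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.hypot(cosU2 * sinLam,
                              cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        if sinSigma == 0:
            return 0.0          # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = (cosSigma - 2 * sinU1 * sinU2 / cos2Alpha
                      if cos2Alpha else 0.0)   # 0 for equatorial lines
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (
                cos2SigmaM + C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < 1e-12:
            break

    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    dSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2)
        - B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2)
                              * (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - dSigma)

# YWB and LPS coordinates from the airport table below, in decimal degrees
print(vincenty_inverse(61.58833, -71.92917, 48.48389, -122.93778) / 1000)
# ≈ 3459.1 km
```

With the YWB and LPS coordinates it returns roughly 3459 km, in line with the figure quoted above.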

Haversine formula
  • 2143.045 miles
  • 3448.897 kilometers
  • 1862.255 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
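The haversine computation fits in a few lines of Python. The sketch below uses a mean Earth radius of 6371 km, a conventional choice rather than necessarily the calculator's exact input:

```python
import math

def haversine(lat1, lon1, lat2, lon2, r=6371.0):
    """Great-circle distance in kilometers on a sphere of radius r (km)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# Same YWB and LPS coordinates as above
print(haversine(61.58833, -71.92917, 48.48389, -122.93778))  # ≈ 3448.9 km
```

The spherical result comes out about 10 km shorter than the ellipsoidal one, which matches the gap between the two sets of figures above.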

How long does it take to fly from Kangiqsujuaq to Lopez?

The estimated flight time from Kangiqsujuaq (Wakeham Bay) Airport to Lopez Island Airport is 4 hours and 34 minutes.
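The site does not publish the formula behind this estimate. A common back-of-the-envelope approach, shown below purely as an illustration, assumes an average cruise speed of about 500 mph plus a fixed allowance of roughly 30 minutes for taxi, takeoff, climb, and descent; with those assumed parameters it lands near, but not exactly on, the quoted time:

```python
def flight_time_hours(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough airliner estimate: cruise leg plus a fixed ground/climb overhead."""
    return distance_miles / cruise_mph + overhead_min / 60

h = flight_time_hours(2149)
print(f"{int(h)} h {round(h % 1 * 60)} min")  # 4 h 48 min with these assumed parameters
```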

Flight carbon footprint between Kangiqsujuaq (Wakeham Bay) Airport (YWB) and Lopez Island Airport (LPS)

On average, flying from Kangiqsujuaq to Lopez generates about 235 kg (517 lbs) of CO2 per passenger. The figures are estimates and include only the CO2 generated by burning jet fuel.
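The unit conversion is simple arithmetic; note that the quoted 517 lbs corresponds to the rounded factor of 2.2 lbs per kg, while the exact factor gives about 518 lbs:

```python
KG_PER_LB = 0.45359237          # exact definition of the pound
print(round(235 / KG_PER_LB))   # 518; 235 * 2.2 gives the quoted 517
```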

Map of flight path and driving directions from Kangiqsujuaq to Lopez

See the map of the shortest flight path between Kangiqsujuaq (Wakeham Bay) Airport (YWB) and Lopez Island Airport (LPS).

Airport information

Origin: Kangiqsujuaq (Wakeham Bay) Airport
City: Kangiqsujuaq
Country: Canada
IATA Code: YWB
ICAO Code: CYKG
Coordinates: 61°35′18″N, 71°55′45″W
Destination: Lopez Island Airport
City: Lopez, WA
Country: United States
IATA Code: LPS
ICAO Code: S31
Coordinates: 48°29′2″N, 122°56′16″W