How far is Lopez, WA, from Kingfisher Lake?

The distance between Kingfisher Lake (Kingfisher Lake Airport) and Lopez (Lopez Island Airport) is 1470 miles / 2366 kilometers / 1278 nautical miles.

The driving distance from Kingfisher Lake (KIF) to Lopez (LPS) is 2034 miles / 3273 kilometers, and travel time by car is about 48 hours 23 minutes.

Distance from Kingfisher Lake to Lopez

There are several ways to calculate the distance from Kingfisher Lake to Lopez. Here are two standard methods:

Vincenty's formula (applied above)
  • 1470.236 miles
  • 2366.116 kilometers
  • 1277.600 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
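
For reference, here is a bare-bones Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The coordinates are decimal-degree conversions of the DMS values in the airport table at the bottom of the page; the convergence tolerance and iteration cap are my own choices, and the antipodal-point safeguards of a production implementation are omitted.

```python
from math import atan, atan2, cos, radians, sin, sqrt, tan

def vincenty_km(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns km."""
    a, f = 6378137.0, 1 / 298.257223563  # WGS-84 semi-major axis (m), flattening
    b = (1 - f) * a
    U1 = atan((1 - f) * tan(radians(lat1)))
    U2 = atan((1 - f) * tan(radians(lat2)))
    L = radians(lon2 - lon1)
    lam = L
    for _ in range(200):  # iterate until lambda converges
        sin_s = sqrt((cos(U2) * sin(lam)) ** 2 +
                     (cos(U1) * sin(U2) - sin(U1) * cos(U2) * cos(lam)) ** 2)
        cos_s = sin(U1) * sin(U2) + cos(U1) * cos(U2) * cos(lam)
        sigma = atan2(sin_s, cos_s)
        sin_a = cos(U1) * cos(U2) * sin(lam) / sin_s
        cos2_a = 1 - sin_a ** 2
        cos_2sm = cos_s - 2 * sin(U1) * sin(U2) / cos2_a
        C = f / 16 * cos2_a * (4 + f * (4 - 3 * cos2_a))
        lam_new = L + (1 - C) * f * sin_a * (
            sigma + C * sin_s * (cos_2sm + C * cos_s * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam_new - lam) < 1e-12:
            lam = lam_new
            break
        lam = lam_new
    u2 = cos2_a * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_s * (cos_2sm + B / 4 * (
        cos_s * (-1 + 2 * cos_2sm ** 2) -
        B / 6 * cos_2sm * (-3 + 4 * sin_s ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - d_sigma) / 1000

# KIF and LPS in decimal degrees (converted from the DMS coordinates below)
print(vincenty_km(53.0125, -89.8553, 48.4839, -122.9378))  # ≈ 2366 km ≈ 1470 mi
```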

Haversine formula
  • 1465.826 miles
  • 2359.018 kilometers
  • 1273.768 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
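
As an illustration, a minimal Python sketch of the haversine formula follows. The mean Earth radius of 6371 km is a conventional assumption on my part; the site does not state which radius it uses.

```python
from math import asin, cos, radians, sin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km, assuming a spherical Earth."""
    R = 6371.0  # mean Earth radius in km (a conventional assumed value)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = (sin(dphi / 2) ** 2 +
         cos(radians(lat1)) * cos(radians(lat2)) * sin(dlmb / 2) ** 2)
    return 2 * R * asin(sqrt(a))

# KIF and LPS in decimal degrees (converted from the DMS coordinates below)
km = haversine_km(53.0125, -89.8553, 48.4839, -122.9378)
print(f"{km:.0f} km = {km / 1.609344:.0f} mi = {km / 1.852:.0f} NM")  # ≈ 2359 km
```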

How long does it take to fly from Kingfisher Lake to Lopez?

The estimated flight time from Kingfisher Lake Airport to Lopez Island Airport is 3 hours and 17 minutes.
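
Estimates like this are usually a rule of thumb: a fixed allowance for taxi, climb, and descent plus the great-circle distance flown at an average cruise speed. Below is a sketch with assumed parameters (500 mph cruise, 30 minutes overhead); the site does not publish its constants, which is why this comes out slightly above the 3 hours 17 minutes quoted.

```python
def flight_time_min(distance_miles, cruise_mph=500, overhead_min=30):
    # assumed rule of thumb: fixed taxi/climb/descent overhead plus cruise time
    return overhead_min + distance_miles / cruise_mph * 60

t = flight_time_min(1470.236)
print(f"{int(t // 60)} h {round(t % 60)} min")  # 3 h 26 min with these assumptions
```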

Flight carbon footprint between Kingfisher Lake Airport (KIF) and Lopez Island Airport (LPS)

On average, flying from Kingfisher Lake to Lopez generates about 178 kg (392 lb) of CO2 per passenger. The figures are estimates and include only the CO2 generated by burning jet fuel.
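
As a rough sketch of how a per-passenger figure like this can be computed: multiply distance by an average emission factor, then convert units. The 0.121 kg CO2 per passenger-mile below is simply back-calculated from the numbers above (178 kg / 1470 mi), not an official constant; real factors vary with aircraft type and load factor.

```python
KG_PER_LB = 0.45359237  # exact kilogram-to-pound conversion factor

def co2_kg(distance_miles, kg_per_pax_mile=0.121):
    # assumed average emission factor, back-calculated from the figures above
    return distance_miles * kg_per_pax_mile

kg = co2_kg(1470.236)
print(f"{kg:.0f} kg = {kg / KG_PER_LB:.0f} lb")  # ≈ 178 kg = 392 lb
```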

Map of flight path and driving directions from Kingfisher Lake to Lopez

See the map of the shortest flight path between Kingfisher Lake Airport (KIF) and Lopez Island Airport (LPS).

Airport information

Origin: Kingfisher Lake Airport
City: Kingfisher Lake
Country: Canada
IATA Code: KIF
ICAO Code: CNM5
Coordinates: 53°0′45″N, 89°51′19″W

Destination: Lopez Island Airport
City: Lopez, WA
Country: United States
IATA Code: LPS
ICAO Code: S31
Coordinates: 48°29′2″N, 122°56′16″W
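
The code sketches above take decimal degrees; converting the DMS coordinates listed here is straightforward (e.g. 53°0′45″N = 53 + 0/60 + 45/3600 = 53.0125°). A minimal helper, with hemisphere letters handled by sign:

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    # south and west hemispheres are negative in the decimal-degree convention
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

print(dms_to_decimal(53, 0, 45, "N"))   # 53.0125    (KIF latitude)
print(dms_to_decimal(89, 51, 19, "W"))  # ≈ -89.8553 (KIF longitude)
```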