
How far is Lopez, WA, from Lloydminster?

The distance between Lloydminster (Lloydminster Airport) and Lopez (Lopez Island Airport) is 652 miles / 1050 kilometers / 567 nautical miles.

The driving distance from Lloydminster (YLL) to Lopez (LPS) is 916 miles / 1474 kilometers, and travel time by car is about 19 hours 6 minutes.

Lloydminster Airport – Lopez Island Airport

652 miles / 1050 kilometers / 567 nautical miles


Distance from Lloydminster to Lopez

There are several ways to calculate the distance from Lloydminster to Lopez. Here are two standard methods:

Vincenty's formula (applied above)
  • 652.468 miles
  • 1050.046 kilometers
  • 566.979 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
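For a quick check of the ellipsoidal figure, here is a minimal Python sketch. It assumes the third-party geopy library is installed; geopy's geodesic() uses Karney's method on the WGS-84 ellipsoid, which, like Vincenty's formula, models the earth as an ellipsoid rather than a sphere, so its result lands very close to the miles figure above.

```python
# Ellipsoidal (WGS-84) distance via geopy -- Karney's method, a close
# relative of Vincenty's formula. Requires: pip install geopy
from geopy.distance import geodesic

yll = (53.30917, -110.07278)   # Lloydminster Airport, 53°18′33″N 110°4′22″W
lps = (48.48389, -122.93778)   # Lopez Island Airport, 48°29′2″N 122°56′16″W

d = geodesic(yll, lps)
print(f"{d.miles:.3f} mi / {d.km:.3f} km / {d.nautical:.3f} nmi")
```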

Haversine formula
  • 650.882 miles
  • 1047.493 kilometers
  • 565.601 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
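The haversine formula is compact enough to write out in full. A self-contained Python sketch, using the airport coordinates from this page (converted to decimal degrees) and a mean earth radius of 6371 km:

```python
# Great-circle (haversine) distance on a spherical earth.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Distance in kilometers between two latitude/longitude points."""
    R = 6371.0  # mean earth radius, km
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * R * asin(sqrt(a))

d_km = haversine_km(53.30917, -110.07278, 48.48389, -122.93778)
print(f"{d_km:.3f} km = {d_km / 1.609344:.3f} mi = {d_km / 1.852:.3f} nmi")
# ≈ 1047.5 km, in line with the haversine figures above
```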

How long does it take to fly from Lloydminster to Lopez?

The estimated flight time from Lloydminster Airport to Lopez Island Airport is 1 hour and 44 minutes.
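The page does not say how this estimate is derived. A common rule of thumb is cruise distance at roughly 500 mph plus about 30 minutes for taxi, takeoff, climb, and descent; the sketch below uses those assumed numbers purely for illustration, so it lands near, but not exactly on, the figure above.

```python
# Rough flight-time estimate. The cruise speed and fixed overhead are
# assumptions, not the calculator's documented method.
def flight_time_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    return overhead_min + distance_miles / cruise_mph * 60

t = flight_time_minutes(652.468)
print(f"about {int(t // 60)} h {round(t % 60)} min")  # ≈ 1 h 48 min with these assumptions
```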

Flight carbon footprint between Lloydminster Airport (YLL) and Lopez Island Airport (LPS)

On average, flying from Lloydminster to Lopez generates about 119 kg of CO2 per passenger, which is equivalent to 263 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
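The stated figure implies an emission factor of roughly 0.113 kg of CO2 per passenger-kilometre (119 kg over about 1050 km). The sketch below rebuilds the estimate from that factor; note the coefficient is back-derived from this page's own numbers, not an official value.

```python
# Per-passenger CO2 estimate from an assumed emission factor.
KG_PER_LB = 0.45359237
FACTOR_KG_PER_PAX_KM = 0.113  # back-derived from 119 kg / 1050 km; an assumption

def co2_kg(distance_km, factor=FACTOR_KG_PER_PAX_KM):
    return distance_km * factor

kg = co2_kg(1050.046)
print(f"about {kg:.0f} kg CO2 (~{kg / KG_PER_LB:.0f} lb)")  # ≈ 119 kg ≈ 262 lb
```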

Map of flight path and driving directions from Lloydminster to Lopez

See the map of the shortest flight path between Lloydminster Airport (YLL) and Lopez Island Airport (LPS).

Airport information

Origin: Lloydminster Airport
City: Lloydminster
Country: Canada
IATA Code: YLL
ICAO Code: CYLL
Coordinates: 53°18′33″N, 110°4′22″W

Destination: Lopez Island Airport
City: Lopez, WA
Country: United States
IATA Code: LPS
ICAO Code: S31
Coordinates: 48°29′2″N, 122°56′16″W
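The coordinates above are listed in degrees, minutes, and seconds; the decimal values used in the sketches earlier on this page come from a conversion like the following (the helper name is illustrative):

```python
# Convert degrees/minutes/seconds to signed decimal degrees.
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

yll = (dms_to_decimal(53, 18, 33, "N"), dms_to_decimal(110, 4, 22, "W"))
lps = (dms_to_decimal(48, 29, 2, "N"), dms_to_decimal(122, 56, 16, "W"))
print(yll)  # ≈ (53.30917, -110.07278)
print(lps)  # ≈ (48.48389, -122.93778)
```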