
How far is Lopez, WA, from St. Anthony?

The distance between St. Anthony (St. Anthony Airport) and Lopez (Lopez Island Airport) is 2884 miles / 4641 kilometers / 2506 nautical miles.

The driving distance from St. Anthony (YAY) to Lopez (LPS) is 4425 miles / 7122 kilometers, and travel time by car is about 94 hours 10 minutes.

St. Anthony Airport – Lopez Island Airport

Distance: 2884 miles / 4641 kilometers / 2506 nautical miles
Flight time: 5 h 57 min
Time difference: 4 h 30 min
CO2 emission: 320 kg


Distance from St. Anthony to Lopez

There are several ways to calculate the distance from St. Anthony to Lopez. Here are two standard methods:

Vincenty's formula (applied above)
  • 2883.795 miles
  • 4641.018 kilometers
  • 2505.950 nautical miles

Vincenty's formula iteratively calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet, which accounts for the earth's flattening at the poles and is therefore slightly more accurate than a spherical model.
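The iteration above can be sketched as follows. This is a minimal implementation of Vincenty's inverse formula, assuming the WGS-84 ellipsoid (the page does not state which ellipsoid it uses, but WGS-84 is the usual choice for such calculators):

```python
import math

# WGS-84 ellipsoid parameters (assumed; not stated on the page)
A_AXIS = 6378137.0          # semi-major axis, metres
F = 1 / 298.257223563       # flattening
B_AXIS = (1 - F) * A_AXIS   # semi-minor axis, metres

def vincenty_km(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Ellipsoidal distance between two points, in kilometres."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - F) * math.tan(phi1))  # reduced latitudes
    U2 = math.atan((1 - F) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L  # difference in longitude on the auxiliary sphere
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = F / 16 * cos2_alpha * (4 + F * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * F * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (2 * cos_2sigma_m ** 2 - 1)))
        if abs(lam - lam_prev) < tol:  # converged
            break

    u2 = cos2_alpha * (A_AXIS ** 2 - B_AXIS ** 2) / B_AXIS ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
                    * (-3 + 4 * cos_2sigma_m ** 2)))
    return B_AXIS * A * (sigma - delta_sigma) / 1000.0

# YAY (51°23′30″N, 56°4′59″W) to LPS (48°29′2″N, 122°56′16″W)
yay = (51 + 23/60 + 30/3600, -(56 + 4/60 + 59/3600))
lps = (48 + 29/60 + 2/3600, -(122 + 56/60 + 16/3600))
print(round(vincenty_km(*yay, *lps), 3))  # close to the 4641.018 km quoted above
```

Note that the iteration can fail to converge for nearly antipodal points; production geodesic libraries handle that case separately.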

Haversine formula
  • 2874.975 miles
  • 4626.824 kilometers
  • 2498.285 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth, giving the great-circle distance: the shortest path between two points along the surface of the sphere.
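The haversine calculation is short enough to show in full. This sketch uses a mean earth radius of 6371 km (an assumption; the page does not state which radius it uses) and the airport coordinates listed below:

```python
import math

R_KM = 6371.0  # mean earth radius in km (assumed value)

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance on a spherical earth, in kilometres."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    # haversine of the central angle between the two points
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * R_KM * math.asin(math.sqrt(a))

# YAY to LPS, using the coordinates from the airport information section
d = haversine_km(51.391667, -56.083056, 48.483889, -122.937778)
print(round(d, 1))  # close to the 4626.824 km quoted above
```

The spherical result comes out roughly 14 km shorter than Vincenty's ellipsoidal figure for this route, which is typical of the sub-0.5% error the spherical approximation introduces.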

How long does it take to fly from St. Anthony to Lopez?

The estimated flight time from St. Anthony Airport to Lopez Island Airport is 5 hours and 57 minutes.
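The page does not say how the flight time is estimated. A common heuristic (an assumption here, not necessarily this site's method) is to divide the great-circle distance by an average block speed; an assumed speed of about 485 mph happens to reproduce the quoted figure:

```python
DISTANCE_MI = 2884       # great-circle distance from above
AVG_SPEED_MPH = 485      # assumed average block speed, incl. climb and descent

total_min = round(DISTANCE_MI / AVG_SPEED_MPH * 60)
hours, minutes = divmod(total_min, 60)
print(f"{hours} h {minutes} min")  # 5 h 57 min with these assumptions
```

Real block times also depend on aircraft type, winds aloft, routing, and taxi time, so treat any such estimate as approximate.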

Flight carbon footprint between St. Anthony Airport (YAY) and Lopez Island Airport (LPS)

On average, flying from St. Anthony to Lopez generates about 320 kg of CO2 per passenger; 320 kilograms is roughly 706 pounds (lbs). These figures are estimates and account only for the CO2 generated by burning jet fuel.
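The kilogram-to-pound conversion is a straightforward multiplication; the quoted 706 lbs suggests the unrounded emission figure is slightly above 320 kg:

```python
KG_TO_LB = 2.20462  # pounds per kilogram

co2_kg = 320
co2_lb = co2_kg * KG_TO_LB
print(round(co2_lb, 1))  # just under the 706 lbs quoted above
```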

Map of flight path and driving directions from St. Anthony to Lopez

See the map of the shortest flight path between St. Anthony Airport (YAY) and Lopez Island Airport (LPS).

Airport information

Origin St. Anthony Airport
City: St. Anthony
Country: Canada
IATA Code: YAY
ICAO Code: CYAY
Coordinates: 51°23′30″N, 56°4′59″W
Destination Lopez Island Airport
City: Lopez, WA
Country: United States
IATA Code: LPS
ICAO Code: S31
Coordinates: 48°29′2″N, 122°56′16″W