
How far is Lopez, WA, from St. John's?

The distance between St. John's (St. John's International Airport) and Lopez (Lopez Island Airport) is 3133 miles / 5042 kilometers / 2722 nautical miles.

The driving distance from St. John's (YYT) to Lopez (LPS) is 4959 miles / 7981 kilometers, and travel time by car is about 104 hours 32 minutes.

St. John's International Airport – Lopez Island Airport

Distance: 3133 miles / 5042 kilometers / 2722 nautical miles
Flight time: 6 h 25 min
Time difference: 4 h 30 min
CO2 emission: 350 kg
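
The 4 h 30 min difference comes from St. John's observing Newfoundland Time (UTC−3:30) while Lopez, WA is on Pacific Time (UTC−8:00); both zones shift for daylight saving on the same dates, so the gap holds year-round. A quick Python check (the January date is an arbitrary choice):

    from datetime import datetime
    from zoneinfo import ZoneInfo

    # Compare UTC offsets of the two airports' zones on an arbitrary date.
    t = datetime(2024, 1, 15, 12, 0)
    offset_yyt = t.replace(tzinfo=ZoneInfo("America/St_Johns")).utcoffset()
    offset_lps = t.replace(tzinfo=ZoneInfo("America/Los_Angeles")).utcoffset()
    print(offset_yyt - offset_lps)  # 4:30:00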


Distance from St. John's to Lopez

There are several ways to calculate the distance from St. John's to Lopez. Here are two standard methods:

Vincenty's formula (applied above)
  • 3132.806 miles
  • 5041.763 kilometers
  • 2722.334 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
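
Below is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid. The coordinates are YYT and LPS from the airport information section; the tolerance and iteration cap are arbitrary choices rather than the site's, and the simple loop shown here can fail to converge for nearly antipodal points.

    import math

    def vincenty_km(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
        """Ellipsoidal (WGS-84) distance in kilometers between two points."""
        a = 6378137.0              # semi-major axis (m)
        f = 1 / 298.257223563      # flattening
        b = (1 - f) * a            # semi-minor axis (m)

        L = math.radians(lon2 - lon1)
        U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
        U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)

        lam = L
        for _ in range(max_iter):
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.hypot(cosU2 * sin_lam,
                                   cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
            if sin_sigma == 0:
                return 0.0         # coincident points
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos2_alpha = 1 - sin_alpha ** 2
            # cos(2*sigma_m); equatorial paths (cos2_alpha == 0) need special care
            cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
            C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
            lam_prev = lam
            lam = L + (1 - C) * f * sin_alpha * (
                sigma + C * sin_sigma * (
                    cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
            if abs(lam - lam_prev) < tol:
                break

        u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sm ** 2)
            - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sm ** 2)))
        return b * A * (sigma - d_sigma) / 1000.0   # meters -> kilometers

    # YYT 47°37′6″N, 52°45′6″W  ->  LPS 48°29′2″N, 122°56′16″W
    km = vincenty_km(47.618333, -52.751667, 48.483889, -122.937778)
    print(f"{km:.3f} km / {km / 1.609344:.3f} mi")   # ~5041.763 km / ~3132.806 mi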

Haversine formula
  • 3123.531 miles
  • 5026.836 kilometers
  • 2714.274 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
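
For comparison, a haversine sketch, assuming a mean Earth radius of 6371 km (the exact constant the site uses may differ slightly):

    import math

    def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
        """Great-circle distance in kilometers on a spherical Earth."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        h = (math.sin(dphi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
        return 2 * radius_km * math.asin(math.sqrt(h))

    km = haversine_km(47.618333, -52.751667, 48.483889, -122.937778)
    print(f"{km:.3f} km")   # close to the 5026.836 km listed above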

How long does it take to fly from St. John's to Lopez?

The estimated flight time from St. John's International Airport to Lopez Island Airport is 6 hours and 25 minutes.
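
Calculators like this one typically derive flight time from distance with a simple cruise-speed model. The sketch below assumes a ~500 mph cruise speed plus a fixed 30-minute allowance for takeoff and landing; both constants are illustrative guesses, which is why the result does not exactly match the 6 h 25 min shown above.

    def flight_time_minutes(distance_miles, cruise_mph=500, overhead_min=30):
        """Rough flight-time estimate: fixed overhead plus time at cruise speed."""
        return overhead_min + distance_miles / cruise_mph * 60

    total = flight_time_minutes(3133)
    print(f"{int(total // 60)} h {int(total % 60)} min")   # 6 h 45 min with these constants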

Flight carbon footprint between St. John's International Airport (YYT) and Lopez Island Airport (LPS)

On average, flying from St. John's to Lopez generates about 350 kg of CO2 per passenger, equivalent to 772 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
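
As a quick unit check (the kilogram-per-pound factor is exact; the per-mile rate simply divides the page's 350 kg figure by its 3133-mile distance):

    KG_PER_LB = 0.45359237   # exact definition of the pound in kilograms

    co2_kg = 350
    print(f"{co2_kg / KG_PER_LB:.0f} lbs")           # 772 lbs
    print(f"{co2_kg / 3133:.3f} kg CO2 per mile")    # ~0.112 kg per passenger-mile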

Map of flight path and driving directions from St. John's to Lopez

See the map of the shortest flight path between St. John's International Airport (YYT) and Lopez Island Airport (LPS).

Airport information

Origin: St. John's International Airport
City: St. John's
Country: Canada
IATA Code: YYT
ICAO Code: CYYT
Coordinates: 47°37′6″N, 52°45′6″W
Destination: Lopez Island Airport
City: Lopez, WA
Country: United States
IATA Code: LPS
ICAO Code: S31
Coordinates: 48°29′2″N, 122°56′16″W