How far is Lopez, WA, from Rankin Inlet?

The distance between Rankin Inlet (Rankin Inlet Airport) and Lopez (Lopez Island Airport) is 1535 miles / 2470 kilometers / 1334 nautical miles.

The driving distance from Rankin Inlet (YRT) to Lopez (LPS) is 1818 miles / 2926 kilometers, and travel time by car is about 41 hours 43 minutes.

Rankin Inlet Airport – Lopez Island Airport

  • 1535 miles
  • 2470 kilometers
  • 1334 nautical miles

Distance from Rankin Inlet to Lopez

There are several ways to calculate the distance from Rankin Inlet to Lopez. Here are two standard methods:

Vincenty's formula (applied above)
  • 1535.053 miles
  • 2470.429 kilometers
  • 1333.925 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
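
To reproduce the ellipsoidal figure, a minimal sketch in Python is shown below. It assumes the third-party geopy package (not something this page relies on); geopy's geodesic distance is computed on the WGS-84 ellipsoid using Karney's algorithm rather than Vincenty's, so the output should closely match the Vincenty values listed above.

  # Ellipsoidal (WGS-84) distance between YRT and LPS.
  # Assumes "pip install geopy" has been run.
  from geopy.distance import geodesic

  # Decimal-degree coordinates converted from the DMS values in the airport information section.
  rankin_inlet = (62.811389, -92.115556)    # 62°48′41″N, 92°6′56″W
  lopez_island = (48.483889, -122.937778)   # 48°29′2″N, 122°56′16″W

  d = geodesic(rankin_inlet, lopez_island)
  print(f"{d.miles:.3f} miles")              # ≈ 1535 miles
  print(f"{d.km:.3f} kilometers")            # ≈ 2470 km
  print(f"{d.nautical:.3f} nautical miles")  # ≈ 1334 nm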

Haversine formula
  • 1531.244 miles
  • 2464.299 kilometers
  • 1330.615 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
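
The haversine figure can be cross-checked with nothing beyond Python's standard library. The sketch below is illustrative and assumes a mean Earth radius of 6371 km, which is why it lands on the spherical values rather than the ellipsoidal ones.

  import math

  def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
      """Great-circle distance, in kilometers, on a sphere of the given radius."""
      phi1, phi2 = math.radians(lat1), math.radians(lat2)
      dphi = math.radians(lat2 - lat1)
      dlam = math.radians(lon2 - lon1)
      a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
      return 2 * radius_km * math.asin(math.sqrt(a))

  km = haversine_km(62.811389, -92.115556, 48.483889, -122.937778)
  print(f"{km:.3f} km")             # ≈ 2464.3 km
  print(f"{km * 0.621371:.3f} mi")  # ≈ 1531.2 miles
  print(f"{km / 1.852:.3f} nm")     # ≈ 1330.6 nautical miles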

How long does it take to fly from Rankin Inlet to Lopez?

The estimated flight time from Rankin Inlet Airport to Lopez Island Airport is 3 hours and 24 minutes.
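
The page does not say how this estimate is derived. Distance calculators commonly add a fixed allowance for taxi, takeoff, and landing to the time spent at a typical cruise speed; the sketch below uses 30 minutes and 500 mph purely as illustrative assumptions, so it approximates rather than reproduces the 3 hour 24 minute figure.

  def estimated_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
      """Rough flight-time estimate: fixed overhead plus time at cruise speed."""
      total_min = overhead_min + distance_miles / cruise_mph * 60
      hours, minutes = divmod(round(total_min), 60)
      return f"{hours} h {minutes} min"

  print(estimated_flight_time(1535))  # "3 h 34 min" under these assumptions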

Flight carbon footprint between Rankin Inlet Airport (YRT) and Lopez Island Airport (LPS)

On average, flying from Rankin Inlet to Lopez generates about 182 kg of CO2 per passenger, equivalent to roughly 401 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
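
The unit conversion itself is straightforward (1 kg ≈ 2.20462 lb), as the quick check below shows.

  co2_kg = 182
  print(f"{co2_kg * 2.20462:.0f} lb")  # ≈ 401 lb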

Map of flight path and driving directions from Rankin Inlet to Lopez

See the map of the shortest flight path between Rankin Inlet Airport (YRT) and Lopez Island Airport (LPS).

Airport information

Origin Rankin Inlet Airport
City: Rankin Inlet
Country: Canada
IATA Code: YRT
ICAO Code: CYRT
Coordinates: 62°48′41″N, 92°6′56″W
Destination Lopez Island Airport
City: Lopez, WA
Country: United States
IATA Code: LPS
ICAO Code: S31
Coordinates: 48°29′2″N, 122°56′16″W
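
The coordinates above are listed in degrees, minutes, and seconds, while the distance formulas earlier expect decimal degrees. A small illustrative helper for the conversion (the function name is an assumption, not part of this page):

  def dms_to_decimal(degrees, minutes, seconds, hemisphere):
      """Convert degrees/minutes/seconds plus a hemisphere letter to signed decimal degrees."""
      value = degrees + minutes / 60 + seconds / 3600
      return -value if hemisphere in ("S", "W") else value

  print(dms_to_decimal(62, 48, 41, "N"), dms_to_decimal(92, 6, 56, "W"))   # ≈ 62.8114, -92.1156 (YRT)
  print(dms_to_decimal(48, 29, 2, "N"), dms_to_decimal(122, 56, 16, "W"))  # ≈ 48.4839, -122.9378 (LPS)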