
How far is Lopez, WA, from Barrow, AK?

The distance between Barrow (Wiley Post–Will Rogers Memorial Airport) and Lopez (Lopez Island Airport) is 1915 miles / 3081 kilometers / 1664 nautical miles.

The driving distance from Barrow (BRW) to Lopez (LPS) is 2764 miles / 4448 kilometers, and travel time by car is about 64 hours 35 minutes.

Wiley Post–Will Rogers Memorial Airport – Lopez Island Airport

  • 1915 miles
  • 3081 kilometers
  • 1664 nautical miles


Distance from Barrow to Lopez

There are several ways to calculate the distance from Barrow to Lopez. Here are two standard methods:

Vincenty's formula (applied above)
  • 1914.592 miles
  • 3081.237 kilometers
  • 1663.735 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
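As a sketch of how Vincenty's inverse method works, the following self-contained Python implementation iterates on the auxiliary longitude λ over the WGS-84 ellipsoid. The coordinates are the BRW and LPS values from the airport table on this page, converted to decimal degrees; the code is an illustration of the standard algorithm, not the calculator actually used by this site.

```python
import math

def vincenty_inverse(lat1, lon1, lat2, lon2):
    """Geodesic distance in meters on the WGS-84 ellipsoid (Vincenty's inverse method)."""
    a = 6378137.0                 # equatorial radius (m)
    f = 1 / 298.257223563         # flattening
    b = (1 - f) * a               # polar radius (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):          # iterate until the auxiliary longitude converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0            # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # cos2_alpha is zero only for equatorial geodesics, not for this route
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - d_sigma)

# BRW and LPS coordinates converted from the DMS values on this page
meters = vincenty_inverse(71.285278, -156.765833, 48.483889, -122.937778)
print(f"{meters / 1000:.3f} km")   # ≈ 3081.2 km, matching the figure above
```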

Haversine formula
  • 1909.995 miles
  • 3073.839 kilometers
  • 1659.740 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
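A minimal haversine implementation, assuming the commonly used mean Earth radius of 6371 km (the site's exact radius constant is not stated, so the last decimal place may differ from the figure above):

```python
import math

def haversine(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in km on a sphere of the given mean Earth radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# BRW and LPS coordinates in decimal degrees
km = haversine(71.285278, -156.765833, 48.483889, -122.937778)
print(f"{km:.1f} km")              # ≈ 3073.9 km with R = 6371 km
print(f"{km / 1.609344:.1f} mi")   # statute miles
```

Because the sphere is slightly smaller than the ellipsoid along this path, the haversine result comes out a few kilometers shorter than Vincenty's.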

How long does it take to fly from Barrow to Lopez?

The estimated flight time from Wiley Post–Will Rogers Memorial Airport to Lopez Island Airport is 4 hours and 7 minutes.
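The page does not state how the flight time is derived. A common rough model is cruise distance over an assumed average speed, plus a fixed allowance for taxi, climb, and descent; the 500 mph speed and 30-minute overhead below are illustrative assumptions, so the result only approximates the 4 hours 7 minutes quoted above.

```python
def estimated_flight_time(distance_mi, cruise_mph=500.0, overhead_min=30):
    """Rough block time: distance at an assumed average cruise speed,
    plus a fixed allowance for taxi, climb, and descent (both assumptions)."""
    total_min = overhead_min + distance_mi / cruise_mph * 60
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} h {minutes} min"

print(estimated_flight_time(1915))   # "4 h 20 min" under these assumptions
```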

Flight carbon footprint between Wiley Post–Will Rogers Memorial Airport (BRW) and Lopez Island Airport (LPS)

On average, flying from Barrow to Lopez generates about 210 kg (roughly 463 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
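The unit conversion and the implied per-mile intensity are simple arithmetic (the 210 kg figure is the page's estimate, not a measured value):

```python
KG_PER_LB = 0.45359237          # exact definition of the international pound

co2_kg = 210.0                  # per-passenger estimate from this page
distance_mi = 1915              # great-circle distance from this page

co2_lb = co2_kg / KG_PER_LB
print(f"{co2_lb:.2f} lb")                                # 462.97 lb
print(f"{co2_kg / distance_mi:.3f} kg CO2 per passenger-mile")
```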

Map of flight path and driving directions from Barrow to Lopez

See the map of the shortest flight path between Wiley Post–Will Rogers Memorial Airport (BRW) and Lopez Island Airport (LPS).

Airport information

Origin Wiley Post–Will Rogers Memorial Airport
City: Barrow, AK
Country: United States
IATA Code: BRW
ICAO Code: PABR
Coordinates: 71°17′7″N, 156°45′57″W
Destination Lopez Island Airport
City: Lopez, WA
Country: United States
IATA Code: LPS
ICAO Code: S31
Coordinates: 48°29′2″N, 122°56′16″W
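The coordinates above are given in degrees-minutes-seconds, while the distance formulas need signed decimal degrees. A small converter for strings in the format shown in this table (the parsing pattern is an assumption based on these strings):

```python
import re

def dms_to_decimal(dms):
    """Convert a DMS string like 71°17′7″N to signed decimal degrees."""
    m = re.fullmatch(r"(\d+)°(\d+)′(\d+)″([NSEW])", dms)
    deg, minutes, seconds, hemi = m.groups()
    value = int(deg) + int(minutes) / 60 + int(seconds) / 3600
    return -value if hemi in "SW" else value   # south and west are negative

print(round(dms_to_decimal("71°17′7″N"), 4))     # 71.2853  (BRW latitude)
print(round(dms_to_decimal("122°56′16″W"), 4))   # -122.9378 (LPS longitude)
```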