
How far is Lopez, WA, from May Creek, AK?

The distance between May Creek (May Creek Airport) and Lopez (Lopez Island Airport) is 1178 miles / 1895 kilometers / 1023 nautical miles.

The driving distance from May Creek (MYK) to Lopez (LPS) is 2276 miles / 3663 kilometers, and travel time by car is about 48 hours 11 minutes.

May Creek Airport – Lopez Island Airport

  • 1178 miles
  • 1895 kilometers
  • 1023 nautical miles


Distance from May Creek to Lopez

There are several ways to calculate the distance from May Creek to Lopez. Here are two standard methods:

Vincenty's formula (applied above)
  • 1177.804 miles
  • 1895.491 kilometers
  • 1023.484 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
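As a rough illustration, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The decimal coordinates are converted from the airport coordinates listed below; the function name is ours, not part of this site's calculator.

```python
# Sketch of Vincenty's inverse formula (ellipsoidal distance, WGS-84).
import math

def vincenty_km(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Distance in kilometers between two lat/lon points on the WGS-84 ellipsoid."""
    a = 6378137.0               # semi-major axis (meters)
    f = 1 / 298.257223563       # flattening
    b = (1 - f) * a             # semi-minor axis (meters)

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (2 * cos_2sigma_m ** 2 - 1)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0

# May Creek (MYK) to Lopez Island (LPS), in decimal degrees
print(round(vincenty_km(61.3356, -142.6869, 48.4839, -122.9378), 1))
```

With these inputs the result lands within a kilometer or two of the 1895.491 km figure above; small differences come from rounding the coordinates to decimal degrees.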

Haversine formula
  • 1175.359 miles
  • 1891.558 kilometers
  • 1021.359 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
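The spherical calculation is much shorter. Below is a minimal sketch of the haversine formula, assuming a mean Earth radius of 6371 km; the function name and decimal coordinates are ours.

```python
# Sketch of the haversine great-circle distance on a sphere of radius 6371 km.
import math

def haversine_km(lat1, lon1, lat2, lon2, r=6371.0):
    """Great-circle distance in kilometers between two lat/lon points."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# May Creek (MYK) to Lopez Island (LPS), in decimal degrees
print(round(haversine_km(61.3356, -142.6869, 48.4839, -122.9378), 1))
```

This reproduces the 1891.558 km figure above to within about a kilometer, the gap from the Vincenty result reflecting the sphere-versus-ellipsoid model difference.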

How long does it take to fly from May Creek to Lopez?

The estimated flight time from May Creek Airport to Lopez Island Airport is 2 hours and 43 minutes.
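Estimates like this are typically derived from the great-circle distance at an assumed cruise speed plus a fixed allowance for taxi, climb, and descent. The parameters below (500 mph cruise, 30 minutes overhead) are illustrative assumptions only, not this calculator's actual values, so the result will not exactly match the figure above.

```python
# Illustrative flight-time estimate: distance / cruise speed + fixed overhead.
# 500 mph and 30 min are assumed values, not this calculator's parameters.
def flight_time_minutes(distance_miles, cruise_mph=500.0, overhead_min=30.0):
    return distance_miles / cruise_mph * 60.0 + overhead_min

minutes = flight_time_minutes(1177.804)
print(f"{int(minutes // 60)} h {round(minutes % 60)} min")
```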

Flight carbon footprint between May Creek Airport (MYK) and Lopez Island Airport (LPS)

On average, flying from May Creek to Lopez generates about 160 kg of CO2 per passenger, equivalent to roughly 354 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.

Map of flight path and driving directions from May Creek to Lopez

See the map of the shortest flight path between May Creek Airport (MYK) and Lopez Island Airport (LPS).

Airport information

Origin May Creek Airport
City: May Creek, AK
Country: United States
IATA Code: MYK
ICAO Code: MYK
Coordinates: 61°20′8″N, 142°41′13″W
Destination Lopez Island Airport
City: Lopez, WA
Country: United States
IATA Code: LPS
ICAO Code: S31
Coordinates: 48°29′2″N, 122°56′16″W