How far is Lopez, WA, from Muskegon, MI?

The distance between Muskegon (Muskegon County Airport) and Lopez (Lopez Island Airport) is 1791 miles / 2883 kilometers / 1557 nautical miles.

The driving distance from Muskegon (MKG) to Lopez (LPS) is 2160 miles / 3476 kilometers, and travel time by car is about 40 hours 23 minutes.

Distance from Muskegon to Lopez

There are several ways to calculate the distance from Muskegon to Lopez. Here are two standard methods:

Vincenty's formula (applied above)
  • 1791.468 miles
  • 2883.089 kilometers
  • 1556.743 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
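For reference, the ellipsoidal calculation can be reproduced in Python. The sketch below is a minimal implementation of Vincenty's inverse method on the WGS-84 ellipsoid; the decimal coordinates are converted from the DMS values listed under Airport information below, and the convergence tolerance and iteration cap are illustrative choices rather than anything published by this site.

import math

def vincenty_distance(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Inverse Vincenty: geodesic distance in metres on the WGS-84 ellipsoid."""
    a = 6378137.0              # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a            # semi-minor axis

    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)

    # Reduced latitudes
    U1 = math.atan((1 - f) * math.tan(phi1))
    U2 = math.atan((1 - f) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0         # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)   # equatorial lines
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# MKG (43°10′10″N, 86°14′17″W) to LPS (48°29′2″N, 122°56′16″W)
metres = vincenty_distance(43.16944, -86.23806, 48.48389, -122.93778)
print(f"{metres / 1609.344:.1f} mi, {metres / 1000:.1f} km, {metres / 1852:.1f} NM")
# Expect roughly 1791 mi / 2883 km / 1557 NM, matching the Vincenty figures above.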

Haversine formula
  • 1786.633 miles
  • 2875.308 kilometers
  • 1552.542 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
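A minimal haversine sketch in Python, using the same decimal coordinates and a mean Earth radius of 6371 km; the exact radius this site assumes is not stated, so the last decimal place may differ slightly.

import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance (km) on a sphere with the given mean Earth radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h))

km = haversine_distance(43.16944, -86.23806, 48.48389, -122.93778)
print(f"{km / 1.609344:.1f} mi, {km:.1f} km, {km / 1.852:.1f} NM")
# Expect about 2875 km, within rounding of the haversine figure above.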

How long does it take to fly from Muskegon to Lopez?

The estimated flight time from Muskegon County Airport to Lopez Island Airport is 3 hours and 53 minutes.
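The site does not publish its flight-time model; a common rough estimate adds a fixed allowance for taxi, climb, and descent to cruise time at a typical airliner speed. The sketch below uses assumed values (500 mph cruise, 30-minute allowance), not the site's actual parameters, and lands in the same ballpark as the figure quoted above.

# Hypothetical rough flight-time estimate; cruise speed and fixed allowance
# are assumptions, not this site's published parameters.
distance_miles = 1791
cruise_mph = 500           # assumed average cruise speed
overhead_minutes = 30      # assumed allowance for takeoff and landing

total_minutes = overhead_minutes + distance_miles / cruise_mph * 60
print(f"about {int(total_minutes // 60)} h {int(total_minutes % 60)} min")
# -> about 4 h 4 min, in the same ballpark as the 3 h 53 min quoted above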

Flight carbon footprint between Muskegon County Airport (MKG) and Lopez Island Airport (LPS)

On average, flying from Muskegon to Lopez generates about 199 kg of CO2 per passenger, which is equivalent to roughly 440 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
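The per-passenger figure depends on emission factors that are not listed here, but the unit conversion and the implied emission intensity are straightforward to check:

# Convert the quoted per-passenger figure and show the implied intensity.
co2_kg = 199
distance_miles = 1791

co2_lb = co2_kg * 2.20462          # kilograms to pounds
kg_per_mile = co2_kg / distance_miles
print(f"{co2_lb:.0f} lb total, ~{kg_per_mile:.3f} kg CO2 per passenger-mile")
# -> about 439 lb (the page rounds to 440) and ~0.111 kg per passenger-mile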

Map of flight path and driving directions from Muskegon to Lopez

See the map of the shortest flight path between Muskegon County Airport (MKG) and Lopez Island Airport (LPS).

Airport information

Origin: Muskegon County Airport
City: Muskegon, MI
Country: United States
IATA Code: MKG
ICAO Code: KMKG
Coordinates: 43°10′10″N, 86°14′17″W

Destination: Lopez Island Airport
City: Lopez, WA
Country: United States
IATA Code: LPS
ICAO Code: S31
Coordinates: 48°29′2″N, 122°56′16″W
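
The coordinates above are given in degrees, minutes, and seconds, while the distance formulas earlier on the page work with decimal degrees. A small hypothetical helper for the conversion:

def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

# MKG: 43°10′10″N, 86°14′17″W  ->  approx. 43.1694, -86.2381
# LPS: 48°29′2″N, 122°56′16″W  ->  approx. 48.4839, -122.9378
print(dms_to_decimal(43, 10, 10, "N"), dms_to_decimal(86, 14, 17, "W"))
print(dms_to_decimal(48, 29, 2, "N"), dms_to_decimal(122, 56, 16, "W"))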