How far is Lopez, WA, from Miami, FL?

The distance between Miami (Miami International Airport) and Lopez (Lopez Island Airport) is 2774 miles / 4465 kilometers / 2411 nautical miles.

The driving distance from Miami (MIA) to Lopez (LPS) is 3423 miles / 5509 kilometers, and travel time by car is about 61 hours 46 minutes.

Miami International Airport – Lopez Island Airport

2774 miles / 4465 kilometers / 2411 nautical miles

Distance from Miami to Lopez

There are several ways to calculate the distance from Miami to Lopez. Here are two standard methods:

Vincenty's formula (applied above)
  • 2774.266 miles
  • 4464.748 kilometers
  • 2410.771 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
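As an illustration, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid, applied to the MIA and LPS coordinates listed under Airport information below. The function name, iteration tolerance, and structure are our own choices for the sketch, not code from this site.

```python
import math

# WGS-84 ellipsoid parameters
A = 6378137.0            # semi-major axis in meters
F = 1 / 298.257223563    # flattening
B = (1 - F) * A          # semi-minor axis in meters

def vincenty_distance(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Ellipsoidal surface distance in meters between two points in degrees."""
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - F) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - F) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                   if cos2_alpha else 0.0)  # equatorial line
        C = F / 16 * cos2_alpha * (4 + F * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * F * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (A ** 2 - B ** 2) / B ** 2
    big_a = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    big_b = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = big_b * sin_sigma * (cos_2sm + big_b / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - big_b / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sm ** 2)))
    return B * big_a * (sigma - d_sigma)  # meters

mia = (25.793056, -80.290556)    # 25°47′35″N, 80°17′26″W
lps = (48.483889, -122.937778)   # 48°29′2″N, 122°56′16″W
print(vincenty_distance(*mia, *lps) / 1609.344)  # ≈ 2774.27 miles
```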

Haversine formula
  • 2771.570 miles
  • 4460.410 kilometers
  • 2408.429 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
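For comparison, a minimal Python sketch of the haversine formula is below; the 3958.8-mile (≈ 6371 km) mean Earth radius is an assumption, since the page does not state which radius it uses.

```python
import math

EARTH_RADIUS_MI = 3958.8  # assumed mean Earth radius in miles (spherical model)

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance in miles between two points given in degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * EARTH_RADIUS_MI * math.asin(math.sqrt(a))

print(haversine_miles(25.793056, -80.290556, 48.483889, -122.937778))
# ≈ 2771.6 miles, slightly shorter than the ellipsoidal result above
```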

How long does it take to fly from Miami to Lopez?

The estimated flight time from Miami International Airport to Lopez Island Airport is 5 hours and 45 minutes.
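The page does not say how this estimate is derived. A common rule of thumb adds a fixed allowance for takeoff and landing to cruise time at a typical airliner speed; the sketch below uses an assumed 500 mph cruise speed and a 30-minute allowance, both illustrative, so its result only approximates the figure above.

```python
CRUISE_MPH = 500       # assumed average cruise speed (illustrative)
OVERHEAD_HOURS = 0.5   # assumed takeoff/landing allowance (illustrative)

def flight_time_hours(distance_miles: float) -> float:
    """Rough flight-time estimate: cruise time plus a fixed overhead."""
    return OVERHEAD_HOURS + distance_miles / CRUISE_MPH

hours = flight_time_hours(2774)
print(f"{int(hours)} h {round(hours % 1 * 60)} min")  # ≈ 6 h 3 min with these assumptions
```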

Flight carbon footprint between Miami International Airport (MIA) and Lopez Island Airport (LPS)

On average, flying from Miami to Lopez generates about 307 kg of CO2 per passenger, which is roughly 677 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
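As a check on the unit conversion, and a sketch of how a distance-based estimate might work, the snippet below converts the page's figure to pounds and applies an assumed per-passenger emission factor; the 0.11 kg per mile factor is illustrative, not this site's published methodology.

```python
KG_PER_LB = 0.45359237     # exact definition of the avoirdupois pound
CO2_KG_PER_MILE = 0.11     # assumed per-passenger factor (illustrative)

print(round(307 / KG_PER_LB))          # 677 lb for the page's 307 kg figure
print(round(2774 * CO2_KG_PER_MILE))   # ≈ 305 kg with the assumed factor
```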

Map of flight path and driving directions from Miami to Lopez

See the map of the shortest flight path between Miami International Airport (MIA) and Lopez Island Airport (LPS).

Airport information

Origin: Miami International Airport
City: Miami, FL
Country: United States
IATA Code: MIA
ICAO Code: KMIA
Coordinates: 25°47′35″N, 80°17′26″W

Destination: Lopez Island Airport
City: Lopez, WA
Country: United States
IATA Code: LPS
ICAO Code: S31
Coordinates: 48°29′2″N, 122°56′16″W
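The coordinates above are given in degrees-minutes-seconds. A small helper (our own, hypothetical, not part of this site) converts them to the signed decimal degrees used by the distance formulas earlier:

```python
import re

def dms_to_decimal(dms: str) -> float:
    """Convert a coordinate like 25°47′35″N to signed decimal degrees."""
    deg, minute, sec, hemi = re.match(r"(\d+)°(\d+)′(\d+)″([NSEW])", dms).groups()
    value = int(deg) + int(minute) / 60 + int(sec) / 3600
    return -value if hemi in "SW" else value  # south and west are negative

print(dms_to_decimal("25°47′35″N"))   # 25.7930...
print(dms_to_decimal("122°56′16″W"))  # -122.9377...
```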