How far is Lopez, WA, from La Tabatière?

The distance between La Tabatière (La Tabatière Airport) and Lopez (Lopez Island Airport) is 2783 miles / 4479 kilometers / 2418 nautical miles.

The driving distance from La Tabatière (ZLT) to Lopez (LPS) is 3864 miles / 6218 kilometers, and travel time by car is about 125 hours 12 minutes.

La Tabatière Airport – Lopez Island Airport

2783 miles / 4479 kilometers / 2418 nautical miles

Distance from La Tabatière to Lopez

There are several ways to calculate the distance from La Tabatière to Lopez. Here are two standard methods:

Vincenty's formula (applied above)
  • 2783.061 miles
  • 4478.902 kilometers
  • 2418.414 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
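
For illustration only, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid, using the airport coordinates listed at the bottom of the page. It is adequate for well-separated, non-antipodal points like these and reproduces the figure above to within a few metres:

```python
from math import radians, sin, cos, tan, atan, atan2, sqrt

def vincenty_km(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns distance in km."""
    a, f = 6378137.0, 1 / 298.257223563   # WGS-84 semi-major axis (m) and flattening
    b = (1 - f) * a

    L = radians(lon2 - lon1)
    U1 = atan((1 - f) * tan(radians(lat1)))
    U2 = atan((1 - f) * tan(radians(lat2)))
    sinU1, cosU1, sinU2, cosU2 = sin(U1), cos(U1), sin(U2), cos(U2)

    lam = L
    for _ in range(200):                   # iterate until lambda converges
        sin_lam, cos_lam = sin(lam), cos(lam)
        sin_sigma = sqrt((cosU2 * sin_lam) ** 2 +
                         (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev, lam = lam, L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2) -
        B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - d_sigma) / 1000.0

print(round(vincenty_km(50.830556, -58.975556, 48.483889, -122.937778), 1))
# ≈ 4478.9 km, matching the Vincenty figure above
```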

Haversine formula
  • 2774.580 miles
  • 4465.253 kilometers
  • 2411.044 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
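
As a quick sketch, the haversine formula in Python, using a mean Earth radius of 6,371 km (an assumed value; the formula itself does not fix the radius) and the decimal equivalents of the coordinates listed below, reproduces the haversine figures above to within rounding:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two points on a sphere of the given radius."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
    return 2 * radius_km * asin(sqrt(a))

# Decimal-degree equivalents of the DMS coordinates in the airport information section
ZLT = (50.830556, -58.975556)   # La Tabatière Airport
LPS = (48.483889, -122.937778)  # Lopez Island Airport

km = haversine_km(*ZLT, *LPS)
print(f"{km:.1f} km = {km * 0.621371:.1f} mi = {km / 1.852:.1f} nm")
# ≈ 4465 km / 2775 mi / 2411 nm
```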

How long does it take to fly from La Tabatière to Lopez?

The estimated flight time from La Tabatière Airport to Lopez Island Airport is 5 hours and 46 minutes.
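
One common rule of thumb that reproduces this figure is to assume a cruise speed of roughly 850 km/h over the great-circle distance plus about 30 minutes for takeoff and landing; both numbers are assumptions inferred from the published estimate, not values stated on this page.

```python
# Assumed rule of thumb: ~30 min overhead plus cruise at ~850 km/h over the distance.
distance_km = 4479
cruise_kmh = 850          # assumed average cruise speed
overhead_min = 30         # assumed taxi/climb/descent allowance

total_min = overhead_min + distance_km / cruise_kmh * 60
print(f"{int(total_min // 60)} h {round(total_min % 60)} min")  # -> 5 h 46 min
```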

Flight carbon footprint between La Tabatière Airport (ZLT) and Lopez Island Airport (LPS)

On average, flying from La Tabatière to Lopez generates about 308 kg (680 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
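
For a rough sense of the arithmetic, dividing the published figure by the distance gives about 0.07 kg of CO2 per passenger-kilometre; the sketch below simply reverses that calculation (the emission factor is back-calculated from this page's own numbers, not an official methodology):

```python
distance_km = 4479
co2_per_pax_km = 0.069    # kg CO2 per passenger-km; back-calculated assumption

co2_kg = distance_km * co2_per_pax_km
print(f"{co2_kg:.0f} kg = {co2_kg * 2.20462:.0f} lb")
# ≈ 309 kg ≈ 681 lb, close to the published 308 kg / 680 lb
```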

Map of flight path and driving directions from La Tabatière to Lopez

See the map of the shortest flight path between La Tabatière Airport (ZLT) and Lopez Island Airport (LPS).

Airport information

Origin La Tabatière Airport
City: La Tabatière
Country: Canada
IATA Code: ZLT
ICAO Code: CTU5
Coordinates: 50°49′50″N, 58°58′32″W
Destination Lopez Island Airport
City: Lopez, WA
Country: United States
IATA Code: LPS
ICAO Code: S31
Coordinates: 48°29′2″N, 122°56′16″W
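
The coordinates above are given in degrees, minutes, and seconds. A small helper like the following (the function name is illustrative) converts them to the signed decimal degrees used by the distance formulas earlier on this page:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere letter to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

print(dms_to_decimal(50, 49, 50, "N"), dms_to_decimal(58, 58, 32, "W"))   # ZLT ≈ 50.8306, -58.9756
print(dms_to_decimal(48, 29, 2, "N"), dms_to_decimal(122, 56, 16, "W"))   # LPS ≈ 48.4839, -122.9378
```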