
How far is Lopez, WA, from Tête-à-la-Baleine?

The distance between Tête-à-la-Baleine (Tête-à-la-Baleine Airport) and Lopez (Lopez Island Airport) is 2771 miles / 4459 kilometers / 2408 nautical miles.

The driving distance from Tête-à-la-Baleine (ZTB) to Lopez (LPS) is 3841 miles / 6181 kilometers, and travel time by car is about 115 hours 35 minutes.

Tête-à-la-Baleine Airport – Lopez Island Airport

2771 miles / 4459 kilometers / 2408 nautical miles


Distance from Tête-à-la-Baleine to Lopez

There are several ways to calculate the distance from Tête-à-la-Baleine to Lopez. Here are two standard methods:

Vincenty's formula (applied above)
  • 2770.684 miles
  • 4458.984 kilometers
  • 2407.659 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
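
For illustration, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. It is written from the standard published iteration, not taken from this calculator's code; the airport coordinates and the ≈2771-mile check value come from the figures on this page.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Ellipsoidal (WGS-84) distance in meters between two points in decimal degrees."""
    a = 6378137.0                 # semi-major axis (m)
    f = 1 / 298.257223563         # flattening
    b = (1 - f) * a               # semi-minor axis (m)

    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(phi1))   # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0            # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos_sq_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos_sq_alpha
                        if cos_sq_alpha else 0.0)   # equatorial line case
        C = f / 16 * cos_sq_alpha * (4 + f * (4 - 3 * cos_sq_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u_sq = cos_sq_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
    B = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)   # distance in meters

# Coordinates from the airport information below, converted to decimal degrees
ztb = (50 + 40 / 60 + 27 / 3600, -(59 + 23 / 60))               # ZTB
lps = (48 + 29 / 60 + 2 / 3600, -(122 + 56 / 60 + 16 / 3600))   # LPS
meters = vincenty_distance(ztb[0], ztb[1], lps[0], lps[1])
print(round(meters / 1609.344), "miles")   # ≈ 2771
```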

Haversine formula
  • 2762.250 miles
  • 4445.410 kilometers
  • 2400.330 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical Earth (great-circle distance, i.e. the shortest path between the two points over the surface of the sphere).
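
A short Python sketch of the haversine formula, again illustrative rather than the calculator's own code; the 6371 km mean Earth radius is an assumption, and the result should land within about a mile of the 2762.250-mile figure above, depending on the radius used.

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in statute miles between two points in decimal degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a)) / 1.609344   # km -> statute miles

# Airport coordinates from the information section below, in decimal degrees
print(round(haversine_miles(50.6742, -59.3833, 48.4839, -122.9378), 1))  # ≈ 2762 miles
```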

How long does it take to fly from Tête-à-la-Baleine to Lopez?

The estimated flight time from Tête-à-la-Baleine Airport to Lopez Island Airport is 5 hours and 44 minutes.
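
The page does not state how the flight time is derived. A common back-of-the-envelope estimate is distance divided by an assumed average block speed; the sketch below simply uses the roughly 483 mph speed implied by the page's own 2770.684-mile and 5 h 44 min figures, so the speed is a derived assumption, not a parameter published by the calculator.

```python
# Hypothetical estimate: distance divided by an assumed average block speed.
# The 483 mph default is back-calculated from this page's own numbers.
def flight_time(distance_miles, avg_block_speed_mph=483.0):
    """Return (hours, minutes) for a flight of the given length."""
    hours = distance_miles / avg_block_speed_mph
    return int(hours), round((hours - int(hours)) * 60)

h, m = flight_time(2770.684)
print(f"{h} h {m:02d} min")   # ≈ 5 h 44 min
```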

Flight carbon footprint between Tête-à-la-Baleine Airport (ZTB) and Lopez Island Airport (LPS)

On average, flying from Tête-à-la-Baleine to Lopez generates about 307 kg of CO2 per passenger; 307 kilograms equals 677 pounds (lbs). These figures are estimates that include only the CO2 generated by burning jet fuel.
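
The page does not publish its emission model, so the sketch below only reproduces the arithmetic implied by its own figures: a per-passenger-kilometer factor back-calculated from 307 kg over 4459 km (about 0.069 kg CO2 per passenger-km) and the exact kilogram-to-pound conversion. Real emission factors vary with aircraft type, load factor and routing.

```python
# Arithmetic sketch only, not the calculator's methodology: the per-km factor
# is back-calculated from the page's own figures (307 kg over 4459 km).
KG_PER_LB = 0.45359237                       # exact definition of the pound

distance_km = 4459
kg_per_passenger_km = 307 / 4459             # ≈ 0.069, implied by this page
co2_kg = distance_km * kg_per_passenger_km   # ≈ 307 kg per passenger
co2_lb = co2_kg / KG_PER_LB                  # ≈ 677 lb

print(f"{co2_kg:.0f} kg ≈ {co2_lb:.0f} lb")
```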

Map of flight path and driving directions from Tête-à-la-Baleine to Lopez

See the map of the shortest flight path between Tête-à-la-Baleine Airport (ZTB) and Lopez Island Airport (LPS).

Airport information

Origin Tête-à-la-Baleine Airport
City: Tête-à-la-Baleine
Country: Canada
IATA Code: ZTB
ICAO Code: CTB6
Coordinates: 50°40′27″N, 59°23′0″W
Destination Lopez Island Airport
City: Lopez, WA
Country: United States
IATA Code: LPS
ICAO Code: S31
Coordinates: 48°29′2″N, 122°56′16″W