How far is Lopez, WA, from Tulsa, OK?

The distance between Tulsa (Tulsa International Airport) and Lopez (Lopez Island Airport) is 1611 miles / 2593 kilometers / 1400 nautical miles.

The driving distance from Tulsa (TUL) to Lopez (LPS) is 2090 miles / 3364 kilometers, and travel time by car is about 37 hours 2 minutes.

Tulsa International Airport – Lopez Island Airport

1611 miles / 2593 kilometers / 1400 nautical miles

Distance from Tulsa to Lopez

There are several ways to calculate the distance from Tulsa to Lopez. Here are two standard methods:

Vincenty's formula (applied above)
  • 1611.049 miles
  • 2592.732 kilometers
  • 1399.963 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.

Haversine formula
  • 1608.455 miles
  • 2588.557 kilometers
  • 1397.709 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth, giving the great-circle distance, the shortest path between two points along the surface.

How long does it take to fly from Tulsa to Lopez?

The estimated flight time from Tulsa International Airport to Lopez Island Airport is 3 hours and 33 minutes.

Flight carbon footprint between Tulsa International Airport (TUL) and Lopez Island Airport (LPS)

On average, flying from Tulsa to Lopez generates about 187 kg (411 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.

Map of flight path and driving directions from Tulsa to Lopez

See the map of the shortest flight path between Tulsa International Airport (TUL) and Lopez Island Airport (LPS).

Airport information

Origin: Tulsa International Airport
City: Tulsa, OK
Country: United States
IATA Code: TUL
ICAO Code: KTUL
Coordinates: 36°11′54″N, 95°53′17″W
Destination: Lopez Island Airport
City: Lopez, WA
Country: United States
IATA Code: LPS
ICAO Code: S31
Coordinates: 48°29′2″N, 122°56′16″W