
How far is Lopez, WA, from Beaumont, TX?

The distance between Beaumont (Jack Brooks Regional Airport) and Lopez (Lopez Island Airport) is 1990 miles / 3202 kilometers / 1729 nautical miles.

The driving distance from Beaumont (BPT) to Lopez (LPS) is 2477 miles / 3986 kilometers, and travel time by car is about 45 hours 17 minutes.

Jack Brooks Regional Airport – Lopez Island Airport

1990 miles / 3202 kilometers / 1729 nautical miles


Distance from Beaumont to Lopez

There are several ways to calculate the distance from Beaumont to Lopez. Here are two standard methods:

Vincenty's formula (applied above)
  • 1989.580 miles
  • 3201.919 kilometers
  • 1728.898 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
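For readers who want to reproduce the figure, the sketch below implements the standard Vincenty inverse formula on the WGS-84 ellipsoid in Python. The function name and the decimal-degree coordinates (converted from the airport coordinates listed further down) are ours for illustration; this is not the calculator's own code.

    import math

    def vincenty_km(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
        # WGS-84 ellipsoid parameters
        a = 6378137.0              # semi-major axis (metres)
        f = 1 / 298.257223563      # flattening
        b = (1 - f) * a            # semi-minor axis (metres)

        U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
        U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
        L = math.radians(lon2 - lon1)
        lam = L

        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)

        for _ in range(max_iter):
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.sqrt((cosU2 * sin_lam) ** 2 +
                                  (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
            if sin_sigma == 0:
                return 0.0         # coincident points
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos_sq_alpha = 1 - sin_alpha ** 2
            cos2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos_sq_alpha
                           if cos_sq_alpha != 0 else 0.0)
            C = f / 16 * cos_sq_alpha * (4 + f * (4 - 3 * cos_sq_alpha))
            lam_prev = lam
            lam = L + (1 - C) * f * sin_alpha * (
                sigma + C * sin_sigma * (
                    cos2sigma_m + C * cos_sigma * (-1 + 2 * cos2sigma_m ** 2)))
            if abs(lam - lam_prev) < tol:
                break

        u_sq = cos_sq_alpha * (a ** 2 - b ** 2) / b ** 2
        A = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
        B = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
        delta_sigma = B * sin_sigma * (
            cos2sigma_m + B / 4 * (
                cos_sigma * (-1 + 2 * cos2sigma_m ** 2) -
                B / 6 * cos2sigma_m * (-3 + 4 * sin_sigma ** 2) *
                (-3 + 4 * cos2sigma_m ** 2)))
        return b * A * (sigma - delta_sigma) / 1000.0   # metres -> kilometres

    # BPT (29°57′2″N, 94°1′14″W) to LPS (48°29′2″N, 122°56′16″W)
    print(vincenty_km(29.950556, -94.020556, 48.483889, -122.937778))  # ≈ 3201.9 km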

Haversine formula
  • 1988.021 miles
  • 3199.410 kilometers
  • 1727.543 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
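A minimal haversine sketch, assuming a mean Earth radius of 6371 km (the radius used by the calculator is not stated), reproduces the spherical figures above to within rounding:

    import math

    def haversine_km(lat1, lon1, lat2, lon2):
        R = 6371.0  # mean Earth radius in kilometres (spherical model)
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        a = (math.sin(dphi / 2) ** 2 +
             math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
        return 2 * R * math.asin(math.sqrt(a))

    # BPT (29°57′2″N, 94°1′14″W) to LPS (48°29′2″N, 122°56′16″W)
    km = haversine_km(29.950556, -94.020556, 48.483889, -122.937778)
    print(f"{km:.1f} km ≈ {km * 0.621371:.1f} mi ≈ {km * 0.539957:.1f} nmi")
    # ≈ 3199.4 km / 1988.0 mi / 1727.5 nmi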

How long does it take to fly from Beaumont to Lopez?

The estimated flight time from Jack Brooks Regional Airport to Lopez Island Airport is 4 hours and 16 minutes.
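The page does not publish the formula behind this estimate. As a rough check, dividing the 1990-mile great-circle distance by an assumed average block speed of about 467 mph (a value chosen purely for illustration, not the site's method) lands on the quoted figure:

    # Rough flight-time check: distance / assumed average block speed.
    distance_mi = 1990
    avg_speed_mph = 467          # illustrative assumption
    hours = distance_mi / avg_speed_mph
    h, m = int(hours), round((hours - int(hours)) * 60)
    print(f"{h} h {m} min")      # -> 4 h 16 min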

Flight carbon footprint between Jack Brooks Regional Airport (BPT) and Lopez Island Airport (LPS)

On average, flying from Beaumont to Lopez generates about 217 kg of CO2 per passenger, which is equivalent to 478 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
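The unit conversion, and a per-mile intensity derived from the page's own numbers (the derivation is ours, not a figure the page quotes), can be checked in a couple of lines:

    # Unit check on the quoted CO2 estimate (217 kg per passenger).
    co2_kg = 217
    print(f"{co2_kg * 2.20462:.0f} lbs")       # -> 478 lbs
    print(f"{co2_kg / 1990:.3f} kg per mile")  # -> ~0.109 kg CO2 per mile flown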

Map of flight path and driving directions from Beaumont to Lopez

See the map of the shortest flight path between Jack Brooks Regional Airport (BPT) and Lopez Island Airport (LPS).

Airport information

Origin: Jack Brooks Regional Airport
City: Beaumont, TX
Country: United States
IATA Code: BPT
ICAO Code: KBPT
Coordinates: 29°57′2″N, 94°1′14″W
Destination: Lopez Island Airport
City: Lopez, WA
Country: United States
IATA Code: LPS
ICAO Code: S31
Coordinates: 48°29′2″N, 122°56′16″W
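The coordinates above are given in degrees, minutes, and seconds, while the distance formulas earlier expect decimal degrees. One way to convert, assuming the notation shown here:

    import re

    def dms_to_decimal(dms: str) -> float:
        """Convert a coordinate like 29°57′2″N into signed decimal degrees."""
        deg, minutes, seconds, hemi = re.match(r"(\d+)°(\d+)′(\d+)″([NSEW])", dms).groups()
        value = int(deg) + int(minutes) / 60 + int(seconds) / 3600
        return -value if hemi in "SW" else value

    print(dms_to_decimal("29°57′2″N"), dms_to_decimal("94°1′14″W"))    # ≈ 29.9506, -94.0206
    print(dms_to_decimal("48°29′2″N"), dms_to_decimal("122°56′16″W"))  # ≈ 48.4839, -122.9378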