
How far is Lopez, WA, from Belleville, IL?

The distance between Belleville (Scott Air Force Base) and Lopez (Lopez Island Airport) is 1782 miles / 2867 kilometers / 1548 nautical miles.

The driving distance from Belleville (BLV) to Lopez (LPS) is 2230 miles / 3589 kilometers, and travel time by car is about 39 hours 55 minutes.

Scott Air Force Base – Lopez Island Airport

1782 Miles
2867 Kilometers
1548 Nautical miles


Distance from Belleville to Lopez

There are several ways to calculate the distance from Belleville to Lopez. Here are two standard methods:

Vincenty's formula (applied above)
  • 1781.636 miles
  • 2867.265 kilometers
  • 1548.199 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
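To reproduce an ellipsoidal distance like this yourself, one option is the Python library geopy, whose geodesic distance also works on an ellipsoidal earth model (Karney's method rather than Vincenty's, so it may differ from the figure above by a few meters). The decimal-degree coordinates below are converted from the airport coordinates listed at the bottom of this page; this is an illustrative sketch, not the calculator's own code.

    # Sketch: ellipsoidal distance between BLV and LPS using geopy.
    # geopy's geodesic uses Karney's algorithm on the WGS-84 ellipsoid.
    from geopy.distance import geodesic

    blv = (38.5450, -89.8350)    # Scott Air Force Base, decimal degrees
    lps = (48.4839, -122.9378)   # Lopez Island Airport, decimal degrees

    d = geodesic(blv, lps)
    print(f"{d.miles:.3f} miles")
    print(f"{d.km:.3f} kilometers")
    print(f"{d.nautical:.3f} nautical miles")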

Haversine formula
  • 1777.766 miles
  • 2861.036 kilometers
  • 1544.836 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
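As a rough illustration, the haversine calculation takes only a few lines of Python. The mean earth radius of 6371 km is an assumption (the page does not state which radius it uses), so the result may differ slightly from the figure above.

    # Sketch: great-circle (haversine) distance between BLV and LPS.
    import math

    def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
        # Great-circle distance between two lat/lon points, in kilometers.
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlmb = math.radians(lon2 - lon1)
        a = (math.sin(dphi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
        return 2 * radius_km * math.atan2(math.sqrt(a), math.sqrt(1 - a))

    km = haversine_km(38.5450, -89.8350, 48.4839, -122.9378)
    print(f"{km:.3f} km / {km / 1.609344:.3f} mi / {km / 1.852:.3f} nmi")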

How long does it take to fly from Belleville to Lopez?

The estimated flight time from Scott Air Force Base to Lopez Island Airport is 3 hours and 52 minutes.
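The page does not say how this estimate is derived; a common rule of thumb is to divide the distance by an average cruise speed and add a fixed allowance for taxi, climb and descent. The speed and allowance below are illustrative assumptions only and will not exactly reproduce the 3 hours 52 minutes quoted above.

    # Sketch: rough flight-time estimate from distance.
    # cruise_mph and overhead_min are assumed values, not the
    # calculator's published parameters.
    def flight_time_minutes(distance_miles, cruise_mph=500, overhead_min=30):
        return distance_miles / cruise_mph * 60 + overhead_min

    minutes = flight_time_minutes(1781.636)
    print(f"about {int(minutes // 60)} h {int(minutes % 60)} min")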

Flight carbon footprint between Scott Air Force Base (BLV) and Lopez Island Airport (LPS)

On average, flying from Belleville to Lopez generates about 199 kg (438 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
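For readers who want to check the unit conversion or the implied per-mile rate, here is a small sketch. Because the quoted 199 kg is itself rounded, converting it back gives roughly 439 lbs rather than exactly 438; the per-mile rate is simply the quoted total divided by the Vincenty distance, not an independent emissions model.

    # Sketch: pounds conversion and implied emission rate for the figure above.
    KG_PER_LB = 0.45359237

    co2_kg = 199                 # per-passenger estimate quoted above
    distance_miles = 1781.636

    print(f"{co2_kg / KG_PER_LB:.0f} lbs")                   # about 439 lbs
    print(f"{co2_kg / distance_miles:.3f} kg CO2 per mile")  # about 0.112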

Map of flight path and driving directions from Belleville to Lopez

See the map of the shortest flight path between Scott Air Force Base (BLV) and Lopez Island Airport (LPS).

Airport information

Origin Scott Air Force Base
City: Belleville, IL
Country: United States
IATA Code: BLV
ICAO Code: KBLV
Coordinates: 38°32′42″N, 89°50′6″W
Destination Lopez Island Airport
City: Lopez, WA
Country: United States
IATA Code: LPS
ICAO Code: S31
Coordinates: 48°29′2″N, 122°56′16″W
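The coordinates above are given in degrees, minutes and seconds. To use them with the distance formulas earlier on this page, they must first be converted to decimal degrees; a minimal sketch:

    # Sketch: convert degrees/minutes/seconds to signed decimal degrees.
    def dms_to_decimal(degrees, minutes, seconds, hemisphere):
        value = degrees + minutes / 60 + seconds / 3600
        return -value if hemisphere in ("S", "W") else value

    # Scott Air Force Base: 38°32′42″N, 89°50′6″W
    blv = (dms_to_decimal(38, 32, 42, "N"), dms_to_decimal(89, 50, 6, "W"))
    # Lopez Island Airport: 48°29′2″N, 122°56′16″W
    lps = (dms_to_decimal(48, 29, 2, "N"), dms_to_decimal(122, 56, 16, "W"))

    print(blv)  # approximately (38.545, -89.835)
    print(lps)  # approximately (48.4839, -122.9378)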