
How far is Lopez, WA, from Newburgh, NY?

The distance between Newburgh (Stewart International Airport) and Lopez (Lopez Island Airport) is 2398 miles / 3860 kilometers / 2084 nautical miles.
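The three figures above are the same great-circle distance expressed in different units. A quick sketch of the conversions, using the standard defined factors (1 mile = 1.609344 km, 1 nautical mile = 1.852 km):

```python
# Exact definitional conversion factors
KM_PER_MILE = 1.609344
KM_PER_NAUTICAL_MILE = 1.852

def km_to_miles(km: float) -> float:
    return km / KM_PER_MILE

def km_to_nautical_miles(km: float) -> float:
    return km / KM_PER_NAUTICAL_MILE

print(round(km_to_miles(3860)))           # ≈ 2398 miles
print(round(km_to_nautical_miles(3860)))  # ≈ 2084 nautical miles
```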

The driving distance from Newburgh (SWF) to Lopez (LPS) is 2932 miles / 4719 kilometers, and travel time by car is about 53 hours 37 minutes.

Stewart International Airport – Lopez Island Airport


Distance from Newburgh to Lopez

There are several ways to calculate the distance from Newburgh to Lopez. Here are two standard methods:

Vincenty's formula (applied above)
  • 2398.309 miles
  • 3859.705 kilometers
  • 2084.074 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
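A minimal sketch of Vincenty's inverse formula on the WGS-84 ellipsoid, fed with the SWF and LPS coordinates from the airport table below (the tolerance and iteration cap are implementation choices, not part of the formula):

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns km."""
    a = 6378137.0              # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a            # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    # Ellipsoidal correction to the arc length
    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
        - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
                                * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0

# SWF → LPS, coordinates from the airport table below
print(vincenty_km(41.5039, -74.1047, 48.4839, -122.9378))  # ≈ 3859.7 km
```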

Haversine formula
  • 2391.957 miles
  • 3849.482 kilometers
  • 2078.554 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
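The haversine computation is much shorter, since it treats the Earth as a sphere. A sketch using the mean Earth radius of 6,371 km and the same airport coordinates:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere (mean Earth radius 6371 km)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# SWF → LPS, coordinates from the airport table below
print(haversine_km(41.5039, -74.1047, 48.4839, -122.9378))  # ≈ 3849 km
```

The spherical result comes out about 10 km shorter than Vincenty's ellipsoidal one, which is the gap between the two figure sets above.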

How long does it take to fly from Newburgh to Lopez?

The estimated flight time from Stewart International Airport to Lopez Island Airport is 5 hours and 2 minutes.
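The page does not state how it derives flight time. A common rule of thumb adds a fixed allowance for taxi, climb, and descent to the distance flown at a typical airliner cruise speed; both numbers below are assumptions for illustration, not the calculator's published method:

```python
def estimate_flight_hours(distance_miles, cruise_mph=500.0, buffer_hours=0.5):
    """Rough flight-time estimate: fixed taxi/climb/descent buffer plus
    cruise time at an assumed average speed (both values are assumptions)."""
    return buffer_hours + distance_miles / cruise_mph

hours = estimate_flight_hours(2398)
h, m = int(hours), round(hours % 1 * 60)
print(f"about {h} hours {m} minutes")  # in the same ballpark as 5 h 2 min
```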

Flight carbon footprint between Stewart International Airport (SWF) and Lopez Island Airport (LPS)

On average, flying from Newburgh to Lopez generates about 263 kg of CO2 per passenger, which is equal to 581 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.

Map of flight path and driving directions from Newburgh to Lopez

See the map of the shortest flight path between Stewart International Airport (SWF) and Lopez Island Airport (LPS).

Airport information

Origin: Stewart International Airport
City: Newburgh, NY
Country: United States
IATA Code: SWF
ICAO Code: KSWF
Coordinates: 41°30′14″N, 74°6′17″W
Destination: Lopez Island Airport
City: Lopez, WA
Country: United States
IATA Code: LPS
ICAO Code: S31
Coordinates: 48°29′2″N, 122°56′16″W