
How far is Lopez, WA, from Jacksonville, NC?

The distance between Jacksonville (Albert J Ellis Airport) and Lopez (Lopez Island Airport) is 2485 miles / 3999 kilometers / 2159 nautical miles.

The driving distance from Jacksonville (OAJ) to Lopez (LPS) is 3045 miles / 4901 kilometers, and travel time by car is about 56 hours 8 minutes.



Distance from Jacksonville to Lopez

There are several ways to calculate the distance from Jacksonville to Lopez. Here are two standard methods:

Vincenty's formula (applied above)
  • 2484.589 miles
  • 3998.558 kilometers
  • 2159.049 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
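For readers who want to reproduce the ellipsoidal figure, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. Variable names follow the usual published notation, and the coordinates come from the Airport information section below:

```python
import math

def vincenty_inverse(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Geodesic distance in meters on the WGS-84 ellipsoid (Vincenty, inverse)."""
    a = 6378137.0                  # semi-major axis (m)
    f = 1 / 298.257223563          # flattening
    b = (1 - f) * a                # semi-minor axis (m)

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):  # iterate on the longitude difference lambda
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.hypot(cosU2 * sinLam,
                              cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        if sinSigma == 0:
            return 0.0  # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = (cosSigma - 2 * sinU1 * sinU2 / cos2Alpha
                      if cos2Alpha else 0.0)  # guard for equatorial lines
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (cos2SigmaM + C * cosSigma *
                                    (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    dSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2) -
        B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2) *
        (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - dSigma)

# OAJ 34°49′45″N 77°36′43″W  ->  LPS 48°29′2″N 122°56′16″W
meters = vincenty_inverse(34 + 49/60 + 45/3600, -(77 + 36/60 + 43/3600),
                          48 + 29/60 + 2/3600, -(122 + 56/60 + 16/3600))
print(f"{meters / 1609.344:.3f} miles")  # ≈ 2484.6 miles, as above
```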

Haversine formula
  • 2479.511 miles
  • 3990.386 kilometers
  • 2154.636 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
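The spherical figure is just as easy to reproduce. A minimal sketch, assuming the commonly used mean earth radius of 6371 km:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in kilometers on a sphere."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h))

km = haversine_km(34.8292, -77.6119, 48.4839, -122.9378)  # OAJ -> LPS, decimal degrees
print(f"{km:.1f} km / {km / 1.609344:.1f} mi")  # ≈ 3990 km / 2479 mi
```

The roughly five-mile gap between the two results reflects the spherical model ignoring the earth's flattening, which the ellipsoidal model accounts for.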

How long does it take to fly from Jacksonville to Lopez?

The estimated flight time from Albert J Ellis Airport to Lopez Island Airport is 5 hours and 12 minutes.
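The page does not state the model behind this estimate, so the following is only an illustration: an assumed average block speed of about 478 mph over the 2485-mile route reproduces the 5 hour 12 minute figure.

```python
distance_mi = 2485        # great-circle distance from above
avg_speed_mph = 478       # assumed block speed; not stated by the source
total_min = round(distance_mi / avg_speed_mph * 60)
h, m = divmod(total_min, 60)
print(f"{h} hours {m} minutes")  # -> 5 hours 12 minutes
```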

Flight carbon footprint between Albert J Ellis Airport (OAJ) and Lopez Island Airport (LPS)

On average, flying from Jacksonville to Lopez generates about 273 kg of CO2 per passenger; 273 kilograms is equal to 602 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
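The unit conversion is easy to verify (1 kg ≈ 2.20462 lb):

```python
kg = 273
print(f"{kg * 2.2046226218:.0f} lbs")  # -> 602 lbs
```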

Map of flight path and driving directions from Jacksonville to Lopez

See the map of the shortest flight path between Albert J Ellis Airport (OAJ) and Lopez Island Airport (LPS).

Airport information

Origin: Albert J Ellis Airport
City: Jacksonville, NC
Country: United States
IATA Code: OAJ
ICAO Code: KOAJ
Coordinates: 34°49′45″N, 77°36′43″W
Destination: Lopez Island Airport
City: Lopez, WA
Country: United States
IATA Code: LPS
ICAO Code: S31
Coordinates: 48°29′2″N, 122°56′16″W
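To feed these degree-minute-second coordinates into the formulas above, they first have to be converted to decimal degrees. A small hypothetical helper:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert D°M′S″ plus a hemisphere letter to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

print(dms_to_decimal(34, 49, 45, "N"))   # OAJ latitude  ->  ≈ 34.8292
print(dms_to_decimal(77, 36, 43, "W"))   # OAJ longitude -> ≈ -77.6119
print(dms_to_decimal(48, 29, 2, "N"))    # LPS latitude  ->  ≈ 48.4839
print(dms_to_decimal(122, 56, 16, "W"))  # LPS longitude -> ≈ -122.9378
```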