
How far is Lopez, WA, from Charlotte, NC?

The distance between Charlotte (Charlotte Douglas International Airport) and Lopez (Lopez Island Airport) is 2317 miles / 3728 kilometers / 2013 nautical miles.

The driving distance from Charlotte (CLT) to Lopez (LPS) is 2912 miles / 4687 kilometers, and travel time by car is about 52 hours 44 minutes.

Charlotte Douglas International Airport – Lopez Island Airport

  • 2317 miles
  • 3728 kilometers
  • 2013 nautical miles


Distance from Charlotte to Lopez

There are several ways to calculate the distance from Charlotte to Lopez. Here are two standard methods:

Vincenty's formula (applied above)
  • 2316.782 miles
  • 3728.499 kilometers
  • 2013.228 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
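As a rough illustration of how such a figure can be computed, here is a sketch of Vincenty's inverse formula in Python, using the WGS-84 ellipsoid parameters and the airport coordinates listed in the airport information section below. This is a minimal, uncached implementation for illustration only, not the exact code used by this site.

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2):
    """Geodesic distance (km) on the WGS-84 ellipsoid via
    Vincenty's inverse formula. Inputs are in degrees."""
    a = 6378137.0              # WGS-84 semi-major axis, metres
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a            # semi-minor axis
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)
    lam = L
    # Iterate the auxiliary longitude until it converges
    for _ in range(200):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m + C * cos_sigma *
                                     (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break
    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
        B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) *
        (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0

# CLT: 35°12′50″N, 80°56′35″W   LPS: 48°29′2″N, 122°56′16″W
km = vincenty_km(35.213889, -80.943056, 48.483889, -122.937778)
print(f"{km:.1f} km / {km / 1.609344:.1f} mi / {km / 1.852:.1f} nmi")
```

Run against the CLT/LPS coordinates, this reproduces the ellipsoidal distance quoted above to within about a kilometre.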

Haversine formula
  • 2312.119 miles
  • 3720.994 kilometers
  • 2009.176 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
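The haversine calculation is short enough to sketch directly. The snippet below assumes a spherical Earth with mean radius 6371 km and uses the airport coordinates from the airport information section below.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance (km) between two lat/lon points in
    degrees, assuming a spherical Earth of the given mean radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return radius_km * 2 * math.atan2(math.sqrt(a), math.sqrt(1 - a))

# CLT: 35°12′50″N, 80°56′35″W   LPS: 48°29′2″N, 122°56′16″W
km = haversine_km(35.213889, -80.943056, 48.483889, -122.937778)
print(f"{km:.1f} km / {km / 1.609344:.1f} mi / {km / 1.852:.1f} nmi")
```

This yields a value a few kilometres shorter than the Vincenty result, matching the spherical figure quoted above to within a couple of kilometres.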

How long does it take to fly from Charlotte to Lopez?

The estimated flight time from Charlotte Douglas International Airport to Lopez Island Airport is 4 hours and 53 minutes.
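An estimate like this can be derived from the distance and an average gate-to-gate ("block") speed. The block speed of 474 mph used below is back-calculated from the figure quoted above and is an assumption, not a published parameter of this site.

```python
def estimated_flight_time(distance_miles, block_speed_mph=474):
    """Rough gate-to-gate flight time, assuming an average block
    speed that folds in taxi, climb, and descent overhead.
    Returns (hours, minutes)."""
    total_minutes = round(distance_miles / block_speed_mph * 60)
    return divmod(total_minutes, 60)

h, m = estimated_flight_time(2317)  # great-circle distance CLT-LPS
print(f"about {h} hours and {m} minutes")
```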

Flight carbon footprint between Charlotte Douglas International Airport (CLT) and Lopez Island Airport (LPS)

On average, flying from Charlotte to Lopez generates about 254 kg (560 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
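A per-passenger estimate of this kind is typically the flight distance multiplied by an emission factor. The factor of roughly 0.11 kg CO2 per passenger-mile below is back-calculated from the figure quoted above and is an assumption; the kilogram-to-pound conversion uses the exact definition 1 lb = 0.45359237 kg.

```python
KG_PER_LB = 0.45359237  # exact by definition

def co2_estimate_kg(distance_miles, kg_per_passenger_mile=0.1096):
    """Per-passenger CO2 from jet-fuel burn, using an assumed
    emission factor back-calculated from the quoted figure."""
    return distance_miles * kg_per_passenger_mile

kg = co2_estimate_kg(2317)  # great-circle distance CLT-LPS
print(f"about {kg:.0f} kg ({kg / KG_PER_LB:.0f} lb) of CO2 per passenger")
```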

Map of flight path and driving directions from Charlotte to Lopez

See the map of the shortest flight path between Charlotte Douglas International Airport (CLT) and Lopez Island Airport (LPS).

Airport information

Origin Charlotte Douglas International Airport
City: Charlotte, NC
Country: United States
IATA Code: CLT
ICAO Code: KCLT
Coordinates: 35°12′50″N, 80°56′35″W
Destination Lopez Island Airport
City: Lopez, WA
Country: United States
IATA Code: LPS
ICAO Code: S31
Coordinates: 48°29′2″N, 122°56′16″W