
How far is Nanaimo from Wilkes-Barre, PA?

The distance between Wilkes-Barre (Wilkes-Barre/Scranton International Airport) and Nanaimo (Nanaimo Airport) is 2368 miles / 3811 kilometers / 2058 nautical miles.

The driving distance from Wilkes-Barre (AVP) to Nanaimo (YCD) is 2920 miles / 4700 kilometers, and travel time by car is about 54 hours 2 minutes.

Wilkes-Barre/Scranton International Airport – Nanaimo Airport

2368 miles / 3811 kilometers / 2058 nautical miles


Distance from Wilkes-Barre to Nanaimo

There are several ways to calculate the distance from Wilkes-Barre to Nanaimo. Here are two standard methods:

Vincenty's formula (applied above)
  • 2368.339 miles
  • 3811.473 kilometers
  • 2058.031 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
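As a sketch of how such a figure can be reproduced, here is a minimal Python implementation of Vincenty's inverse formula on the WGS-84 ellipsoid. The coordinates are the airports' positions from the table below, converted to decimal degrees; the iteration count and tolerance are assumptions, not values published by this site.

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Vincenty's inverse formula on the WGS-84 ellipsoid; returns kilometers."""
    a = 6378137.0            # WGS-84 semi-major axis (meters)
    f = 1 / 298.257223563    # WGS-84 flattening
    b = a * (1 - f)          # semi-minor axis (meters)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:  # converged
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sm + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sm ** 2)
            - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
                              * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0

# AVP (41.338333, -75.723333) to YCD (49.052222, -123.87)
print(round(vincenty_km(41.338333, -75.723333, 49.052222, -123.87)))
```

Run with the coordinates above, this yields approximately the 3811-kilometer figure quoted on this page.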

Haversine formula
  • 2362.119 miles
  • 3801.463 kilometers
  • 2052.626 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
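The haversine calculation is much shorter. A minimal sketch, assuming the commonly used mean Earth radius of 6371 km:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere with the given mean Earth radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h))

# AVP (41.338333, -75.723333) to YCD (49.052222, -123.87)
print(round(haversine_km(41.338333, -75.723333, 49.052222, -123.87)))
```

Because the spherical model slightly understates this mostly east-west route on an oblate Earth, the result comes in about 10 km shorter than the Vincenty figure.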

How long does it take to fly from Wilkes-Barre to Nanaimo?

The estimated flight time from Wilkes-Barre/Scranton International Airport to Nanaimo Airport is 4 hours and 59 minutes.
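The site does not publish its flight-time formula. One simple model that approximately reproduces the quoted figure divides the great-circle distance by an assumed average block speed of about 475 mph; both the model and the speed are assumptions for illustration.

```python
def flight_time(distance_miles, avg_speed_mph=475.0):
    """Rough block-time estimate; 475 mph is an assumed average speed,
    not this site's published formula."""
    hours = distance_miles / avg_speed_mph
    h = int(hours)
    m = int(round((hours - h) * 60))
    return f"{h} h {m} min"

print(flight_time(2368))  # → 4 h 59 min with the assumed speed
```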

Flight carbon footprint between Wilkes-Barre/Scranton International Airport (AVP) and Nanaimo Airport (YCD)

On average, flying from Wilkes-Barre to Nanaimo generates about 260 kg of CO2 per passenger, which is roughly 573 pounds (lbs). This figure is an estimate and includes only the CO2 generated by burning jet fuel.
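The kilograms-to-pounds conversion uses the exact definition of the avoirdupois pound:

```python
KG_PER_LB = 0.45359237  # exact definition: 1 lb = 0.45359237 kg

def kg_to_lbs(kg):
    """Convert kilograms to avoirdupois pounds."""
    return kg / KG_PER_LB

print(round(kg_to_lbs(260)))  # → 573
```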

Map of flight path and driving directions from Wilkes-Barre to Nanaimo

See the map of the shortest flight path between Wilkes-Barre/Scranton International Airport (AVP) and Nanaimo Airport (YCD).

Airport information

Origin Wilkes-Barre/Scranton International Airport
City: Wilkes-Barre, PA
Country: United States
IATA Code: AVP
ICAO Code: KAVP
Coordinates: 41°20′18″N, 75°43′24″W
Destination Nanaimo Airport
City: Nanaimo
Country: Canada
IATA Code: YCD
ICAO Code: CYCD
Coordinates: 49°3′8″N, 123°52′12″W
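The coordinates above are given in degrees/minutes/seconds; distance formulas like those earlier on this page take decimal degrees. The conversion is straightforward, with southern and western hemispheres taking a negative sign:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter (N/S/E/W)
    to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# AVP: 41°20′18″N, 75°43′24″W
print(round(dms_to_decimal(41, 20, 18, "N"), 6))  # → 41.338333
print(round(dms_to_decimal(75, 43, 24, "W"), 6))  # → -75.723333
```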