
How far is Windsor from Nanaimo?

The distance between Nanaimo (Nanaimo Airport) and Windsor (Windsor International Airport) is 2010 miles / 3234 kilometers / 1746 nautical miles.

The driving distance from Nanaimo (YCD) to Windsor (YQG) is 2444 miles / 3933 kilometers, and travel time by car is about 46 hours 28 minutes.


Distance from Nanaimo to Windsor

There are several ways to calculate the distance from Nanaimo to Windsor. Here are two standard methods:

Vincenty's formula (applied above)
  • 2009.660 miles
  • 3234.234 kilometers
  • 1746.347 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
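As a sketch of how such a figure can be computed, here is a minimal Python implementation of Vincenty's inverse solution on the WGS-84 ellipsoid. The airport coordinates (converted to decimal degrees from the DMS values listed further down) and the WGS-84 constants are the only inputs; the iteration tolerance and cap are implementation choices, not taken from this page.

```python
from math import atan, atan2, cos, radians, sin, sqrt, tan

def vincenty_km(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Vincenty inverse solution on the WGS-84 ellipsoid; returns kilometers."""
    a, f = 6378137.0, 1 / 298.257223563      # WGS-84 semi-major axis (m), flattening
    b = (1 - f) * a
    L = radians(lon2 - lon1)
    U1 = atan((1 - f) * tan(radians(lat1)))  # reduced latitudes
    U2 = atan((1 - f) * tan(radians(lat2)))
    sinU1, cosU1, sinU2, cosU2 = sin(U1), cos(U1), sin(U2), cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = sin(lam), cos(lam)
        sin_sigma = sqrt((cosU2 * sin_lam) ** 2 +
                         (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        if sin_sigma == 0:
            return 0.0                       # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm +
                                     C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2) -
        B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0

# Nanaimo (YCD) to Windsor (YQG)
km = vincenty_km(49.05222, -123.87000, 42.27556, -82.95556)
```

With these inputs the result is approximately 3234 km, matching the Vincenty figure above.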

Haversine formula
  • 2004.341 miles
  • 3225.675 kilometers
  • 1741.725 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
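The haversine calculation is compact enough to show in full. This Python sketch assumes the commonly used mean Earth radius of 6371 km; a different radius choice shifts the result slightly, which is why it may not match the figure above to the last decimal.

```python
from math import asin, cos, radians, sin, sqrt

def haversine(lat1, lon1, lat2, lon2):
    """Great-circle distance assuming a spherical Earth (mean radius 6371 km).

    Returns (statute miles, kilometers).
    """
    R_KM = 6371.0
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
    km = 2 * R_KM * asin(sqrt(a))
    return km / 1.609344, km

# Nanaimo (YCD) to Windsor (YQG), decimal degrees
miles, km = haversine(49.05222, -123.87000, 42.27556, -82.95556)  # ≈ 2004 mi / 3226 km
```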

How long does it take to fly from Nanaimo to Windsor?

The estimated flight time from Nanaimo Airport to Windsor International Airport is 4 hours and 18 minutes.
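The page does not state how this estimate is derived. A common rule of thumb is to divide the great-circle distance by a typical cruise speed and add a fixed allowance for taxi, climb, and descent; both parameters below are assumptions, and with a 500 mph cruise the rule gives a slightly longer time than the 4 h 18 min quoted above.

```python
def flight_time_hours(distance_miles, cruise_mph=500, overhead_hours=0.5):
    """Rule-of-thumb block time: cruise leg plus fixed taxi/climb/descent overhead.

    Both cruise_mph and overhead_hours are assumed values, not this site's formula.
    """
    return distance_miles / cruise_mph + overhead_hours

hours = flight_time_hours(2010)
h, m = int(hours), round(hours % 1 * 60)
print(f"{h} h {m} min")  # 4 h 31 min with these assumed parameters
```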

Flight carbon footprint between Nanaimo Airport (YCD) and Windsor International Airport (YQG)

On average, flying from Nanaimo to Windsor generates about 219 kg (482 lb) of CO2 per passenger. These figures are estimates and include only the CO2 produced by burning jet fuel.
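The page does not publish its emission model, but the numbers above imply a flat factor of roughly 0.109 kg of CO2 per passenger-mile (219 kg / 2010 mi). This sketch uses that implied factor as an assumption:

```python
KG_CO2_PER_PASSENGER_MILE = 0.109  # assumed; implied by this route: 219 kg / 2010 mi
KG_PER_LB = 0.45359237

def co2_kg(distance_miles, factor=KG_CO2_PER_PASSENGER_MILE):
    """Per-passenger CO2 from jet fuel burn, using a flat per-mile factor."""
    return distance_miles * factor

kg = co2_kg(2010)
print(f"{kg:.0f} kg ({kg / KG_PER_LB:.0f} lb)")
```

Real per-flight emissions vary with aircraft type, load factor, and routing, so a single per-mile factor is only a first approximation.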

Map of flight path and driving directions from Nanaimo to Windsor

See the map of the shortest flight path between Nanaimo Airport (YCD) and Windsor International Airport (YQG).

Airport information

Origin: Nanaimo Airport
City: Nanaimo
Country: Canada
IATA Code: YCD
ICAO Code: CYCD
Coordinates: 49°3′8″N, 123°52′12″W
Destination: Windsor International Airport
City: Windsor
Country: Canada
IATA Code: YQG
ICAO Code: CYQG
Coordinates: 42°16′32″N, 82°57′20″W
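The coordinates above are given in degrees/minutes/seconds, while distance formulas take signed decimal degrees. A small conversion helper, using Nanaimo's listed coordinates as the worked example:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere letter (N/S/E/W)
    to signed decimal degrees; south and west are negative."""
    sign = -1 if hemisphere in "SW" else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# Nanaimo Airport: 49°3′8″N, 123°52′12″W
lat = dms_to_decimal(49, 3, 8, "N")     # ≈ 49.0522
lon = dms_to_decimal(123, 52, 12, "W")  # ≈ -123.8700
```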