How far is Nanaimo from Seattle, WA?

The distance between Seattle (Seattle–Tacoma International Airport) and Nanaimo (Nanaimo Airport) is 132 miles / 213 kilometers / 115 nautical miles.

The driving distance from Seattle (SEA) to Nanaimo (YCD) is 202 miles / 325 kilometers, and travel time by car is about 5 hours 29 minutes.

Seattle–Tacoma International Airport – Nanaimo Airport

132 miles / 213 kilometers / 115 nautical miles

Distance from Seattle to Nanaimo

There are several ways to calculate the distance from Seattle to Nanaimo. Here are two standard methods:

Vincenty's formula (applied above)
  • 132.131 miles
  • 212.644 kilometers
  • 114.818 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
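
If you want to reproduce the ellipsoidal figure yourself, the sketch below uses the third-party geopy library (an assumption on my part; the calculator does not say what it runs). geopy's geodesic distance is computed with Karney's algorithm on the WGS-84 ellipsoid rather than Vincenty's original iteration, but over a route this short the two agree to well under a metre. The decimal coordinates are converted from the airport coordinates listed under Airport information below.

    # Ellipsoidal (geodesic) distance between SEA and YCD.
    # Requires the third-party geopy package: pip install geopy
    from geopy.distance import geodesic

    SEA = (47.4489, -122.3089)   # 47°26′56″N, 122°18′32″W
    YCD = (49.0522, -123.8700)   # 49°3′8″N, 123°52′12″W

    d = geodesic(SEA, YCD)       # WGS-84 ellipsoid by default
    print(f"{d.miles:.3f} mi, {d.kilometers:.3f} km, {d.nautical:.3f} nmi")
    # Expect roughly 132.1 mi / 212.6 km / 114.8 nmi, in line with the figures above.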

Haversine formula
  • 132.014 miles
  • 212.456 kilometers
  • 114.717 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
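
The haversine figure is easy to reproduce with nothing but the Python standard library. The sketch below uses the conventional mean Earth radius of 6,371 km and the decimal equivalents of the airport coordinates listed further down the page.

    from math import radians, sin, cos, asin, sqrt

    def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
        """Great-circle distance between two points on a spherical Earth."""
        phi1, phi2 = radians(lat1), radians(lat2)
        dphi = radians(lat2 - lat1)
        dlam = radians(lon2 - lon1)
        a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
        return 2 * radius_km * asin(sqrt(a))

    km = haversine_km(47.4489, -122.3089, 49.0522, -123.8700)  # SEA -> YCD
    print(f"{km:.1f} km / {km * 0.621371:.1f} mi / {km / 1.852:.1f} nmi")
    # Prints roughly 212.5 km / 132.0 mi / 114.7 nmi, matching the values above.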

How long does it take to fly from Seattle to Nanaimo?

The estimated flight time from Seattle–Tacoma International Airport to Nanaimo Airport is 45 minutes.
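
The calculator does not publish its flight-time formula. A common rule of thumb (an assumption here, not the site's documented method) is to allow about 30 minutes for taxi, climb and descent plus cruise time at roughly 500 mph, which lands on essentially the same number:

    # Rough flight-time estimate: fixed 30-minute overhead plus cruise at ~500 mph (assumed values).
    distance_miles = 132
    minutes = 30 + distance_miles / 500 * 60
    print(round(minutes))  # ~46 minutes, close to the 45-minute estimate above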

What is the time difference between Seattle and Nanaimo?

There is no time difference between Seattle and Nanaimo.

Flight carbon footprint between Seattle–Tacoma International Airport (SEA) and Nanaimo Airport (YCD)

On average, flying from Seattle to Nanaimo generates about 44 kg (roughly 98 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
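
The emission factor behind this estimate is not stated. Dividing the quoted figures gives an implied rate of roughly 0.33 kg of CO2 per passenger-mile for this short hop; short flights spend proportionally more time in fuel-hungry climb, so their per-mile rate runs higher than long-haul averages.

    # Implied emission factor, derived only from the figures quoted above (illustrative).
    co2_kg = 44
    distance_miles = 132
    print(f"{co2_kg / distance_miles:.2f} kg CO2 per passenger-mile")  # ~0.33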

Map of flight path and driving directions from Seattle to Nanaimo

See the map of the shortest flight path between Seattle–Tacoma International Airport (SEA) and Nanaimo Airport (YCD).

Airport information

Origin: Seattle–Tacoma International Airport
City: Seattle, WA
Country: United States
IATA Code: SEA
ICAO Code: KSEA
Coordinates: 47°26′56″N, 122°18′32″W
Destination: Nanaimo Airport
City: Nanaimo
Country: Canada
IATA Code: YCD
ICAO Code: CYCD
Coordinates: 49°3′8″N, 123°52′12″W
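
The coordinates above are listed in degrees, minutes and seconds. The decimal values used in the distance sketches earlier on this page can be derived with a small helper like the one below (the function name is illustrative, not part of any library):

    def dms_to_decimal(degrees, minutes, seconds, hemisphere):
        """Convert degrees/minutes/seconds plus a hemisphere letter to signed decimal degrees."""
        value = degrees + minutes / 60 + seconds / 3600
        return -value if hemisphere in ("S", "W") else value

    print(dms_to_decimal(47, 26, 56, "N"), dms_to_decimal(122, 18, 32, "W"))  # SEA ≈ 47.4489, -122.3089
    print(dms_to_decimal(49, 3, 8, "N"), dms_to_decimal(123, 52, 12, "W"))    # YCD ≈ 49.0522, -123.8700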