
How far is Wekweètì from Nanaimo?

The distance between Nanaimo (Nanaimo Harbour Water Airport) and Wekweètì (Wekweètì Airport) is 1101 miles / 1772 kilometers / 957 nautical miles.

The driving distance from Nanaimo (ZNA) to Wekweètì (YFJ) is 1653 miles / 2660 kilometers, and travel time by car is about 37 hours 55 minutes.

Nanaimo Harbour Water Airport – Wekweètì Airport

1101 miles / 1772 kilometers / 957 nautical miles


Distance from Nanaimo to Wekweètì

There are several ways to calculate the distance from Nanaimo to Wekweètì. Here are two standard methods:

Vincenty's formula (applied above)
  • 1101.305 miles
  • 1772.378 kilometers
  • 957.007 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
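As a concrete illustration, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The function name is ours, and the decimal coordinates are converted from the DMS values listed under "Airport information" below:

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Ellipsoidal distance in metres between two lat/lon points (Vincenty inverse)."""
    a = 6378137.0                       # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563               # WGS-84 flattening
    b = (1 - f) * a                     # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):                # iterate lambda until convergence
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.hypot(cosU2 * sinLam,
                              cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        if sinSigma == 0:
            return 0.0                  # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = (cosSigma - 2 * sinU1 * sinU2 / cos2Alpha
                      if cos2Alpha else 0.0)  # equatorial case
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (
                cos2SigmaM + C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < 1e-12:
            break

    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    deltaSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2) -
        B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2)
        * (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - deltaSigma)  # geodesic length in metres

d_m = vincenty_distance(49.1831, -123.9497, 64.1906, -114.0769)  # ZNA -> YFJ
print(d_m / 1000, d_m / 1609.344)  # should be close to 1772 km / 1101 miles
```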

Haversine formula
  • 1099.509 miles
  • 1769.488 kilometers
  • 955.447 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
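A matching Python sketch of the haversine formula. The 6371 km mean Earth radius is a common choice; a slightly different radius shifts the result by a fraction of a percent:

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in kilometres on a sphere of the given radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    # haversine of the central angle between the two points
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h))

print(haversine_distance(49.1831, -123.9497, 64.1906, -114.0769))  # ≈ 1769 km
```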

How long does it take to fly from Nanaimo to Wekweètì?

The estimated flight time from Nanaimo Harbour Water Airport to Wekweètì Airport is 2 hours and 35 minutes.
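The calculator does not publish its timing model, but estimators like this one typically assume a fixed overhead for taxi, takeoff, and landing plus cruise at a typical jet speed. A hypothetical sketch (the 500 mph cruise speed and 30-minute overhead are assumptions, not the site's published parameters):

```python
def estimated_flight_time(distance_miles, cruise_mph=500.0, overhead_min=30.0):
    # assumed model: fixed taxi/takeoff/landing overhead plus time at cruise speed
    total_min = overhead_min + distance_miles / cruise_mph * 60.0
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} h {minutes:02d} min"

print(estimated_flight_time(1101))  # ≈ 2 h 42 min under these assumptions
```

With these assumed parameters the sketch lands a few minutes above the stated 2 hours 35 minutes, so the site evidently uses slightly different constants.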

Flight carbon footprint between Nanaimo Harbour Water Airport (ZNA) and Wekweètì Airport (YFJ)

On average, flying from Nanaimo to Wekweètì generates about 157 kg of CO2 per passenger, equivalent to 346 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
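The kilogram-to-pound figure is a straight unit conversion:

```python
LB_PER_KG = 2.20462       # pounds per kilogram
co2_kg = 157
co2_lb = co2_kg * LB_PER_KG
print(round(co2_lb))      # 346
```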

Map of flight path and driving directions from Nanaimo to Wekweètì

See the map of the shortest flight path between Nanaimo Harbour Water Airport (ZNA) and Wekweètì Airport (YFJ).

Airport information

Origin: Nanaimo Harbour Water Airport
City: Nanaimo
Country: Canada
IATA Code: ZNA
ICAO Code: CAC8
Coordinates: 49°10′59″N, 123°56′59″W
Destination: Wekweètì Airport
City: Wekweètì
Country: Canada
IATA Code: YFJ
ICAO Code: CYWE
Coordinates: 64°11′26″N, 114°4′37″W
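The coordinates above are given in degrees, minutes, and seconds, while the distance formulas earlier on the page take decimal degrees. A minimal converter (the function name is ours):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    # south and west hemispheres are negative in the decimal-degree convention
    sign = -1.0 if hemisphere in ("S", "W") else 1.0
    return sign * (degrees + minutes / 60 + seconds / 3600)

zna_lat = dms_to_decimal(49, 10, 59, "N")    # ≈  49.1831
zna_lon = dms_to_decimal(123, 56, 59, "W")   # ≈ -123.9497
yfj_lat = dms_to_decimal(64, 11, 26, "N")    # ≈  64.1906
yfj_lon = dms_to_decimal(114, 4, 37, "W")    # ≈ -114.0769
```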