
How far is Nanaimo from Prince George?

The distance between Prince George (Prince George Airport) and Nanaimo (Nanaimo Harbour Water Airport) is 330 miles / 531 kilometers / 287 nautical miles.

The driving distance from Prince George (YXS) to Nanaimo (ZNA) is 536 miles / 862 kilometers, and travel time by car is about 12 hours 40 minutes.

Prince George Airport – Nanaimo Harbour Water Airport

  • 330 miles
  • 531 kilometers
  • 287 nautical miles


Distance from Prince George to Nanaimo

There are several ways to calculate the distance from Prince George to Nanaimo. Here are two standard methods:

Vincenty's formula (applied above)
  • 329.913 miles
  • 530.944 kilometers
  • 286.687 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points using an ellipsoidal model of the earth, which is more accurate than a spherical model but more expensive to compute.
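As a concrete sketch of the method, here is a minimal Python implementation of the inverse Vincenty formula. The page does not state which reference ellipsoid it uses, so the WGS-84 parameters below are an assumption, and the decimal coordinates are converted from the DMS values listed under "Airport information"; the result may differ slightly from the figure above.

```python
import math

# WGS-84 ellipsoid parameters (assumed; the page's ellipsoid is not stated)
A_AXIS = 6378137.0             # semi-major axis (m)
F = 1 / 298.257223563          # flattening
B_AXIS = (1 - F) * A_AXIS      # semi-minor axis (m)

def vincenty_km(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Inverse Vincenty: geodesic distance on the WGS-84 ellipsoid, in km."""
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - F) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - F) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0                      # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # cos(2 * sigma_m); zero for equatorial lines where cos2_alpha == 0
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                   if cos2_alpha else 0.0)
        C = F / 16 * cos2_alpha * (4 + F * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * F * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (A_AXIS ** 2 - B_AXIS ** 2) / B_AXIS ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
        * (-3 + 4 * cos_2sm ** 2)))
    return B_AXIS * A * (sigma - d_sigma) / 1000.0

# YXS -> ZNA (decimal degrees; west longitudes are negative)
print(f"{vincenty_km(53.8892, -122.6789, 49.1831, -123.9497):.1f} km")
```

The iteration on lambda converges quickly for points like these; it can fail to converge only for nearly antipodal pairs.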

Haversine formula
  • 329.702 miles
  • 530.604 kilometers
  • 286.503 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).

How long does it take to fly from Prince George to Nanaimo?

The estimated flight time from Prince George Airport to Nanaimo Harbour Water Airport is 1 hour and 7 minutes.
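The page does not publish its timing model, but estimates like this are commonly a fixed overhead for taxi, climb, and descent plus time at cruise speed. A hedged sketch, where both the ~500 mph cruise speed and the ~30 minute overhead are assumptions:

```python
def estimate_flight_minutes(distance_miles, cruise_mph=500.0, overhead_min=30.0):
    """Rough airliner flight-time heuristic: fixed taxi/climb/descent
    overhead plus time spent at cruise speed. Both constants are assumed."""
    return overhead_min + distance_miles / cruise_mph * 60.0

minutes = estimate_flight_minutes(330)
print(f"{int(minutes // 60)} h {round(minutes % 60)} min")  # → 1 h 10 min
```

With these assumed constants the heuristic gives about 1 h 10 min, close to the 1 h 7 min quoted above.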

What is the time difference between Prince George and Nanaimo?

There is no time difference between Prince George and Nanaimo.

Flight carbon footprint between Prince George Airport (YXS) and Nanaimo Harbour Water Airport (ZNA)

On average, flying from Prince George to Nanaimo generates about 74 kg (roughly 162 lb) of CO2 per passenger. This figure is an estimate and covers only the CO2 produced by burning jet fuel.

Map of flight path and driving directions from Prince George to Nanaimo

See the map of the shortest flight path between Prince George Airport (YXS) and Nanaimo Harbour Water Airport (ZNA).

Airport information

Origin Prince George Airport
City: Prince George
Country: Canada
IATA Code: YXS
ICAO Code: CYXS
Coordinates: 53°53′21″N, 122°40′44″W
Destination Nanaimo Harbour Water Airport
City: Nanaimo
Country: Canada
IATA Code: ZNA
ICAO Code: CAC8
Coordinates: 49°10′59″N, 123°56′59″W
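The coordinates above are given in degrees/minutes/seconds, while the distance formulas need signed decimal degrees. A small converter, assuming the exact DMS notation used on this page (degree sign, prime, double prime, hemisphere letter):

```python
import re

def dms_to_decimal(dms: str) -> float:
    """Convert a string like '53°53′21″N' to signed decimal degrees.
    South and west hemispheres are negative."""
    m = re.fullmatch(r"(\d+)°(\d+)′(\d+)″([NSEW])", dms)
    if not m:
        raise ValueError(f"unrecognized DMS string: {dms!r}")
    deg, minutes, seconds, hemi = m.groups()
    value = int(deg) + int(minutes) / 60 + int(seconds) / 3600
    return -value if hemi in "SW" else value

print(round(dms_to_decimal("53°53′21″N"), 4))   # 53.8892
print(round(dms_to_decimal("122°40′44″W"), 4))  # -122.6789
```

These are the decimal values fed into the haversine and Vincenty formulas above.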