
How far is Whale Cove from Port Hardy?

The distance between Port Hardy (Port Hardy Airport) and Whale Cove (Whale Cove Airport) is 1526 miles / 2456 kilometers / 1326 nautical miles.

The driving distance from Port Hardy (YZT) to Whale Cove (YXN) is 2060 miles / 3315 kilometers, and travel time by car is about 47 hours 4 minutes.

Port Hardy Airport – Whale Cove Airport

  • 1526 miles
  • 2456 kilometers
  • 1326 nautical miles


Distance from Port Hardy to Whale Cove

There are several ways to calculate the distance from Port Hardy to Whale Cove. Here are two standard methods:

Vincenty's formula (applied above)
  • 1526.105 miles
  • 2456.028 kilometers
  • 1326.149 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
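The page applies Vincenty's formula but does not show the calculation itself. Below is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid, using the airport coordinates from the airport information section converted to decimal degrees; the iteration limit and tolerance are illustrative choices, not the calculator's exact implementation.

```python
import math

def vincenty_distance_m(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns metres."""
    a = 6378137.0              # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a            # semi-minor axis (m)

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):                      # iterate until lambda converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0                        # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m + C * cos_sigma *
                                     (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
        B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) *
        (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# Airport coordinates from the airport information section, in decimal degrees.
yzt = (50.6806, -127.3669)   # Port Hardy Airport (YZT)
yxn = (62.2400, -92.5981)    # Whale Cove Airport (YXN)
print(round(vincenty_distance_m(*yzt, *yxn) / 1000))  # ≈ 2456 km, as quoted above
```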

Haversine formula
  • 1521.739 miles
  • 2449.002 kilometers
  • 1322.355 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest path between two points along the earth's surface).
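For comparison, a short sketch of the haversine calculation, assuming a mean earth radius of 6371 km and the same airport coordinates converted to decimal degrees:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given mean earth radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

yzt = (50.6806, -127.3669)   # Port Hardy Airport (YZT)
yxn = (62.2400, -92.5981)    # Whale Cove Airport (YXN)
print(round(haversine_km(*yzt, *yxn)))  # ≈ 2449 km, as quoted above
```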

How long does it take to fly from Port Hardy to Whale Cove?

The estimated flight time from Port Hardy Airport to Whale Cove Airport is 3 hours and 23 minutes.
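The page does not state how this estimate is derived. A common rough model is a fixed allowance for taxi, climb and descent plus cruise at an assumed average speed; the figures below are assumptions for illustration only, so the result lands in the same ballpark as, but not exactly on, the 3 hours 23 minutes quoted above.

```python
distance_miles = 1526       # Vincenty distance quoted above
cruise_mph = 500            # assumed average cruise speed (not from the page)
overhead_hours = 0.5        # assumed taxi/climb/descent allowance (not from the page)

hours = overhead_hours + distance_miles / cruise_mph
print(f"{int(hours)} h {round(hours % 1 * 60)} min")  # 3 h 33 min with these assumptions
```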

Flight carbon footprint between Port Hardy Airport (YZT) and Whale Cove Airport (YXN)

On average, flying from Port Hardy to Whale Cove generates about 181 kg of CO2 per passenger; 181 kilograms is equal to 399 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
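A quick check of the unit conversion quoted above (the 181 kg figure itself comes from the calculator's emissions model and is taken as given):

```python
kg_per_passenger = 181              # estimate quoted above
lbs = kg_per_passenger * 2.20462    # kilograms to pounds
print(round(lbs))                   # 399
```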

Map of flight path and driving directions from Port Hardy to Whale Cove

See the map of the shortest flight path between Port Hardy Airport (YZT) and Whale Cove Airport (YXN).

Airport information

Origin: Port Hardy Airport
City: Port Hardy
Country: Canada
IATA Code: YZT
ICAO Code: CYZT
Coordinates: 50°40′50″N, 127°22′1″W

Destination: Whale Cove Airport
City: Whale Cove
Country: Canada
IATA Code: YXN
ICAO Code: CYXN
Coordinates: 62°14′24″N, 92°35′53″W