
How far is Port Hardy from Natashquan?

The distance between Natashquan (Natashquan Airport) and Port Hardy (Port Hardy Airport) is 2797 miles / 4502 kilometers / 2431 nautical miles.

The driving distance from Natashquan (YNA) to Port Hardy (YZT) is 3994 miles / 6427 kilometers, and travel time by car is about 79 hours 3 minutes.

Natashquan Airport – Port Hardy Airport: 2797 miles · 4502 kilometers · 2431 nautical miles

Distance from Natashquan to Port Hardy

There are several ways to calculate the distance from Natashquan to Port Hardy. Here are two standard methods:

Vincenty's formula (applied above)
  • 2797.160 miles
  • 4501.593 kilometers
  • 2430.666 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
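The iterative inverse method behind such a figure can be sketched as follows. This is a standard textbook implementation of Vincenty's inverse formula on the WGS-84 ellipsoid, not the calculator's own code; the convergence tolerance and iteration cap are assumptions.

```python
import math

def vincenty_inverse_m(lat1, lon1, lat2, lon2):
    """Ellipsoidal distance in metres between two lat/lon points (WGS-84)."""
    a = 6378137.0               # semi-major axis (m)
    f = 1 / 298.257223563       # flattening
    b = (1 - f) * a             # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate until the longitude difference converges
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.sqrt((cosU2 * sinLam) ** 2 +
                             (cosU1 * sinU2 - sinU1 * cosU2 * cosLam) ** 2)
        if sinSigma == 0:
            return 0.0  # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = (cosSigma - 2 * sinU1 * sinU2 / cos2Alpha) if cos2Alpha else 0.0
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (
                cos2SigmaM + C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < 1e-12:
            break

    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    deltaSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2) -
        B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2) * (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - deltaSigma)

# YNA and YZT in decimal degrees (converted from the airport table below)
d_km = vincenty_inverse_m(50.18972, -61.78917, 50.68056, -127.36694) / 1000
print(f"{d_km:.1f} km")  # close to the 4501.6 km quoted above
```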

Haversine formula
  • 2788.484 miles
  • 4487.630 kilometers
  • 2423.127 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
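The haversine computation is much shorter and reproduces the figures above. A minimal sketch, assuming the commonly used mean Earth radius of 6371 km:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere between two lat/lon points."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2 +
         math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# YNA and YZT in decimal degrees (converted from the airport table below)
d = haversine_km(50.18972, -61.78917, 50.68056, -127.36694)
print(f"{d:.1f} km")  # ≈ 4487.6 km, matching the haversine figure above
```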

How long does it take to fly from Natashquan to Port Hardy?

The estimated flight time from Natashquan Airport to Port Hardy Airport is 5 hours and 47 minutes.
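A common way to estimate this is a fixed taxi/climb overhead plus cruise time. The 30-minute overhead and ~530 mph effective speed below are illustrative assumptions that happen to reproduce the quoted figure, not the calculator's actual parameters:

```python
def estimate_flight_time(distance_miles, cruise_mph=530.0, overhead_min=30):
    """Return (hours, minutes) for a rough block-time estimate."""
    total_hours = overhead_min / 60 + distance_miles / cruise_mph
    hours = int(total_hours)
    minutes = round((total_hours - hours) * 60)
    return hours, minutes

print(estimate_flight_time(2797.160))  # → (5, 47)
```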

Flight carbon footprint between Natashquan Airport (YNA) and Port Hardy Airport (YZT)

On average, flying from Natashquan to Port Hardy generates about 310 kg of CO2 per passenger; 310 kilograms equals about 683 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
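The unit conversion is straightforward, using the exact definition of the avoirdupois pound:

```python
KG_PER_LB = 0.45359237  # exact: 1 lb = 0.45359237 kg

def kg_to_lbs(kg):
    return kg / KG_PER_LB

print(round(kg_to_lbs(310)))  # → 683
```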

Map of flight path and driving directions from Natashquan to Port Hardy

See the map of the shortest flight path between Natashquan Airport (YNA) and Port Hardy Airport (YZT).

Airport information

Origin Natashquan Airport
City: Natashquan
Country: Canada
IATA Code: YNA
ICAO Code: CYNA
Coordinates: 50°11′23″N, 61°47′21″W
Destination Port Hardy Airport
City: Port Hardy
Country: Canada
IATA Code: YZT
ICAO Code: CYZT
Coordinates: 50°40′50″N, 127°22′1″W
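The coordinates above are given in degrees/minutes/seconds; the distance formulas need decimal degrees. A small helper (the function name is illustrative):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere letter to decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# Airport coordinates from the table above, as (lat, lon) pairs
yna = (dms_to_decimal(50, 11, 23, "N"), dms_to_decimal(61, 47, 21, "W"))
yzt = (dms_to_decimal(50, 40, 50, "N"), dms_to_decimal(127, 22, 1, "W"))
print(yna, yzt)
```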