
How far is Thunder Bay from Port Hardy?

The distance between Port Hardy (Port Hardy Airport) and Thunder Bay (Thunder Bay International Airport) is 1700 miles / 2736 kilometers / 1477 nautical miles.

The driving distance from Port Hardy (YZT) to Thunder Bay (YQT) is 2287 miles / 3681 kilometers, and travel time by car is about 43 hours 54 minutes.

Port Hardy Airport – Thunder Bay International Airport

1700 miles / 2736 kilometers / 1477 nautical miles


Distance from Port Hardy to Thunder Bay

There are several ways to calculate the distance from Port Hardy to Thunder Bay. Here are two standard methods:

Vincenty's formula (applied above)
  • 1699.839 miles
  • 2735.626 kilometers
  • 1477.120 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
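As a rough illustration, below is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The coordinates are the decimal-degree equivalents of the airport coordinates listed further down; the WGS-84 constants and the iteration tolerance are standard values, not anything specific to this site's implementation.

import math

def vincenty_inverse(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Ellipsoidal (WGS-84) distance in meters via Vincenty's inverse formula."""
    a = 6378137.0              # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0         # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos_sq_alpha = 1 - sin_alpha ** 2
        cos2sm = cos_sigma - 2 * sinU1 * sinU2 / cos_sq_alpha if cos_sq_alpha else 0.0
        C = f / 16 * cos_sq_alpha * (4 + f * (4 - 3 * cos_sq_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos2sm + C * cos_sigma * (-1 + 2 * cos2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u_sq = cos_sq_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
    B = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
    delta_sigma = B * sin_sigma * (cos2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos2sm ** 2)
        - B / 6 * cos2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos2sm ** 2)))
    return b * A * (sigma - delta_sigma)

# YZT and YQT coordinates in decimal degrees (see airport information below)
meters = vincenty_inverse(50.680556, -127.366944, 48.371667, -89.323889)
print(f"{meters / 1609.344:.1f} mi, {meters / 1000:.1f} km")  # ≈ 1700 mi / 2736 km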

Haversine formula
  • 1694.692 miles
  • 2727.342 kilometers
  • 1472.647 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
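For comparison, here is a minimal haversine sketch in Python. The 6371 km mean Earth radius is the usual convention for this formula, and the coordinates are again the decimal-degree values from the airport information below.

import math

def haversine(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in kilometers, assuming a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

km = haversine(50.680556, -127.366944, 48.371667, -89.323889)
print(f"{km:.1f} km = {km / 1.609344:.1f} mi")  # ≈ 2727 km / 1695 mi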

How long does it take to fly from Port Hardy to Thunder Bay?

The estimated flight time from Port Hardy Airport to Thunder Bay International Airport is 3 hours and 43 minutes.
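The page does not state how this figure is derived. A common rule of thumb is cruise distance divided by an assumed average speed, plus a fixed allowance for taxi, climb, and descent; the speed and allowance below are illustrative assumptions only, chosen to roughly reproduce the quoted 3 hours 43 minutes.

def estimate_flight_time(distance_mi, cruise_mph=500.0, overhead_min=20.0):
    """Very rough block-time estimate: cruise time plus a fixed taxi/climb/descent
    allowance. Both parameters are illustrative assumptions, not the calculator's
    actual model."""
    total_min = distance_mi / cruise_mph * 60 + overhead_min
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} h {minutes} min"

print(estimate_flight_time(1700))  # ≈ 3 h 44 min, close to the quoted 3 h 43 min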

Flight carbon footprint between Port Hardy Airport (YZT) and Thunder Bay International Airport (YQT)

On average, flying from Port Hardy to Thunder Bay generates about 193 kg of CO2 per passenger, which is roughly 425 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
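The emissions model behind the 193 kg figure is not published on the page; the snippet below only shows the unit conversion and the per-mile rate implied by the quoted numbers (193 kg over 1700 miles), not how the estimate is actually computed.

KG_PER_LB = 0.45359237

co2_kg = 193.0          # quoted per-passenger estimate for YZT–YQT
distance_mi = 1700.0    # quoted flight distance

co2_lb = co2_kg / KG_PER_LB
implied_g_per_mi = co2_kg * 1000 / distance_mi

print(f"{co2_kg:.0f} kg ≈ {co2_lb:.0f} lb")                               # ≈ 425 lb
print(f"implied rate ≈ {implied_g_per_mi:.0f} g CO2 per passenger-mile")  # ≈ 114 g/mi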

Map of flight path and driving directions from Port Hardy to Thunder Bay

See the map of the shortest flight path between Port Hardy Airport (YZT) and Thunder Bay International Airport (YQT).

Airport information

Origin: Port Hardy Airport
City: Port Hardy
Country: Canada
IATA Code: YZT
ICAO Code: CYZT
Coordinates: 50°40′50″N, 127°22′1″W
Destination: Thunder Bay International Airport
City: Thunder Bay
Country: Canada
IATA Code: YQT
ICAO Code: CYQT
Coordinates: 48°22′18″N, 89°19′26″W
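The decimal-degree coordinates used in the distance examples above come from these degree–minute–second values; a small sketch of that conversion (with south latitude and west longitude taken as negative):

def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus an N/S/E/W hemisphere to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

# Port Hardy Airport (YZT): 50°40′50″N, 127°22′1″W
print(dms_to_decimal(50, 40, 50, "N"), dms_to_decimal(127, 22, 1, "W"))
# Thunder Bay International Airport (YQT): 48°22′18″N, 89°19′26″W
print(dms_to_decimal(48, 22, 18, "N"), dms_to_decimal(89, 19, 26, "W"))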