
How far is Brandon from Port Hardy?

The distance between Port Hardy (Port Hardy Airport) and Brandon (Brandon Municipal Airport) is 1208 miles / 1944 kilometers / 1050 nautical miles.

The driving distance from Port Hardy (YZT) to Brandon (YBR) is 1583 miles / 2547 kilometers, and travel time by car is about 32 hours 35 minutes.

Port Hardy Airport – Brandon Municipal Airport

1208 miles / 1944 kilometers / 1050 nautical miles


Distance from Port Hardy to Brandon

There are several ways to calculate the distance from Port Hardy to Brandon. Here are two standard methods:

Vincenty's formula (applied above)
  • 1208.056 miles
  • 1944.177 kilometers
  • 1049.772 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
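The calculator does not publish its implementation, but as a rough illustration, here is a minimal Python sketch of the standard Vincenty inverse formula on the WGS-84 ellipsoid (the choice of ellipsoid parameters is an assumption, though WGS-84 is the usual one):

import math

def vincenty_distance_m(lat1, lon1, lat2, lon2):
    """Ellipsoidal distance in metres between two lat/lon points (Vincenty inverse formula)."""
    a = 6378137.0           # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563   # WGS-84 flattening
    b = (1 - f) * a         # semi-minor axis (m)

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate lambda until it converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam, cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos_sq_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos_sq_alpha if cos_sq_alpha else 0.0
        C = f / 16 * cos_sq_alpha * (4 + f * (4 - 3 * cos_sq_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u_sq = cos_sq_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
    B = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
    delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma)

# YZT to YBR (decimal degrees); should roughly reproduce the figures quoted above
d = vincenty_distance_m(50.6806, -127.3669, 49.9100, -99.9517)
print(d / 1000, d / 1609.344)  # ≈ 1944 km, ≈ 1208 miles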

Haversine formula
  • 1204.319 miles
  • 1938.164 kilometers
  • 1046.525 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest path between two points over the earth's surface).
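For comparison, a minimal Python sketch of the haversine formula, using a mean earth radius of 3,958.8 miles (the exact radius the calculator uses is an assumption):

import math

def haversine_miles(lat1, lon1, lat2, lon2, radius_miles=3958.8):
    """Great-circle distance in miles, assuming a spherical earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_miles * math.asin(math.sqrt(a))

# YZT to YBR (decimal degrees)
print(haversine_miles(50.6806, -127.3669, 49.9100, -99.9517))  # ≈ 1204 miles, as above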

How long does it take to fly from Port Hardy to Brandon?

The estimated flight time from Port Hardy Airport to Brandon Municipal Airport is 2 hours and 47 minutes.
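The page does not state how the flight time is estimated. A common rule of thumb, shown here purely as an illustration (the cruising speed and fixed overhead are assumptions), is to divide the distance by a typical jet cruising speed and add an allowance for takeoff and landing:

distance_miles = 1208     # great-circle distance from above
cruise_mph = 500          # assumed typical jet cruising speed
overhead_hours = 0.5      # assumed allowance for taxi, climb and descent

hours = distance_miles / cruise_mph + overhead_hours
print(f"{int(hours)} h {round(hours % 1 * 60)} min")  # ≈ 2 h 55 min (the page quotes 2 h 47 min)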

Flight carbon footprint between Port Hardy Airport (YZT) and Brandon Municipal Airport (YBR)

On average, flying from Port Hardy to Brandon generates about 162 kg of CO2 per passenger, which is equivalent to 357 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
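A quick check of the unit conversion and of the per-mile rate implied by the page's own figures:

co2_kg = 162           # per-passenger estimate quoted above
distance_miles = 1208

print(co2_kg * 2.20462)         # ≈ 357 lbs
print(co2_kg / distance_miles)  # ≈ 0.134 kg CO2 per passenger-mile implied by these figures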

Map of flight path and driving directions from Port Hardy to Brandon

See the map of the shortest flight path between Port Hardy Airport (YZT) and Brandon Municipal Airport (YBR).

Airport information

Origin: Port Hardy Airport
City: Port Hardy
Country: Canada
IATA Code: YZT
ICAO Code: CYZT
Coordinates: 50°40′50″N, 127°22′1″W

Destination: Brandon Municipal Airport
City: Brandon
Country: Canada
IATA Code: YBR
ICAO Code: CYBR
Coordinates: 49°54′36″N, 99°57′6″W
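To plug these coordinates into the distance formulas above, they must first be converted from degrees/minutes/seconds to decimal degrees; a minimal sketch:

def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds to signed decimal degrees."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

# Port Hardy Airport (YZT): 50°40′50″N, 127°22′1″W
yzt = (dms_to_decimal(50, 40, 50, "N"), dms_to_decimal(127, 22, 1, "W"))
# Brandon Municipal Airport (YBR): 49°54′36″N, 99°57′6″W
ybr = (dms_to_decimal(49, 54, 36, "N"), dms_to_decimal(99, 57, 6, "W"))

print(yzt, ybr)  # ≈ (50.6806, -127.3669) (49.9100, -99.9517)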