How far is Cranbrook from Hall Beach?

The distance between Hall Beach (Hall Beach Airport) and Cranbrook (Cranbrook/Canadian Rockies International Airport) is 1760 miles / 2833 kilometers / 1530 nautical miles.

Hall Beach Airport – Cranbrook/Canadian Rockies International Airport

  • 1760 miles
  • 2833 kilometers
  • 1530 nautical miles

Distance from Hall Beach to Cranbrook

There are several ways to calculate the distance from Hall Beach to Cranbrook. Here are two standard methods:

Vincenty's formula (applied above)
  • 1760.403 miles
  • 2833.093 kilometers
  • 1529.748 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
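
For illustration, here is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid. The function name and iteration cap are our own choices, and the sketch skips the edge cases (equatorial and near-antipodal points) that production implementations handle:

    import math

    def vincenty_km(lat1, lon1, lat2, lon2):
        """Distance in km on the WGS-84 ellipsoid via Vincenty's inverse formula."""
        a = 6378137.0              # WGS-84 semi-major axis (meters)
        f = 1 / 298.257223563      # WGS-84 flattening
        b = (1 - f) * a            # semi-minor axis (meters)
        L = math.radians(lon2 - lon1)
        U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
        U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)
        lam = L
        for _ in range(200):       # iterate until the longitude difference converges
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.hypot(cosU2 * sin_lam,
                                   cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos2_alpha = 1 - sin_alpha ** 2
            cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
            C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
            lam_prev = lam
            lam = L + (1 - C) * f * sin_alpha * (sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
            if abs(lam - lam_prev) < 1e-12:
                break
        u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sm ** 2)
            - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
        return b * A * (sigma - d_sigma) / 1000.0

    # Hall Beach (YUX) to Cranbrook (YXC), coordinates from the airport info below
    print(vincenty_km(68.775833, -81.243333, 49.610556, -115.781944))  # ~2833.1 km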

Haversine formula
  • 1755.940 miles
  • 2825.911 kilometers
  • 1525.870 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth, giving the great-circle distance: the shortest path between two points along the surface of the sphere.
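
The haversine formula is much simpler. Here is a minimal Python sketch, assuming the common 6371 km mean earth radius (the function name is ours):

    import math

    def haversine_km(lat1, lon1, lat2, lon2):
        """Great-circle distance in km, assuming a spherical earth."""
        R = 6371.0  # mean earth radius in km (assumed value)
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        a = (math.sin(dphi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
        return 2 * R * math.asin(math.sqrt(a))

    # Hall Beach (YUX) to Cranbrook (YXC), coordinates from the airport info below
    print(haversine_km(68.775833, -81.243333, 49.610556, -115.781944))  # ~2825.9 km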

How long does it take to fly from Hall Beach to Cranbrook?

The estimated flight time from Hall Beach Airport to Cranbrook/Canadian Rockies International Airport is 3 hours and 49 minutes.
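
The page does not say how this estimate is derived. A common approach is distance divided by an assumed average block speed; the sketch below uses a hypothetical 460 mph, chosen only so the result lands near the quoted time:

    # Hypothetical estimate: distance divided by an assumed average block speed.
    distance_miles = 1760
    avg_speed_mph = 460      # assumed average speed, not the site's actual parameter
    total_minutes = int(distance_miles / avg_speed_mph * 60)
    hours, minutes = divmod(total_minutes, 60)
    print(f"{hours} hours and {minutes} minutes")  # 3 hours and 49 minutes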

Flight carbon footprint between Hall Beach Airport (YUX) and Cranbrook/Canadian Rockies International Airport (YXC)

On average, flying from Hall Beach to Cranbrook generates about 197 kg of CO2 per passenger, which is equal to about 434 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
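
As a quick check of the kilograms-to-pounds conversion (1 kg ≈ 2.20462 lb):

    co2_kg = 197
    co2_lbs = co2_kg * 2.20462   # 1 kg = 2.20462 lb
    print(round(co2_lbs))        # 434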

Map of flight path from Hall Beach to Cranbrook

See the map of the shortest flight path between Hall Beach Airport (YUX) and Cranbrook/Canadian Rockies International Airport (YXC).

Airport information

Origin: Hall Beach Airport
City: Hall Beach
Country: Canada
IATA Code: YUX
ICAO Code: CYUX
Coordinates: 68°46′33″N, 81°14′36″W
Destination: Cranbrook/Canadian Rockies International Airport
City: Cranbrook
Country: Canada
IATA Code: YXC
ICAO Code: CYXC
Coordinates: 49°36′38″N, 115°46′55″W
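
To feed these coordinates into the distance formulas above, the degrees-minutes-seconds values must first be converted to decimal degrees. A small helper (our own, for illustration):

    def dms_to_decimal(degrees, minutes, seconds, hemisphere):
        """Convert degrees-minutes-seconds plus a hemisphere letter to decimal degrees."""
        sign = -1 if hemisphere in ("S", "W") else 1
        return sign * (degrees + minutes / 60 + seconds / 3600)

    yux = (dms_to_decimal(68, 46, 33, "N"), dms_to_decimal(81, 14, 36, "W"))
    yxc = (dms_to_decimal(49, 36, 38, "N"), dms_to_decimal(115, 46, 55, "W"))
    print(yux)  # (68.775833..., -81.243333...)
    print(yxc)  # (49.610555..., -115.781944...)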