How far is Hall Beach from St. John's?

The distance between St. John's (St. John's International Airport) and Hall Beach (Hall Beach Airport) is 1762 miles / 2835 kilometers / 1531 nautical miles.

St. John's International Airport – Hall Beach Airport

Distance: 1762 miles / 2835 kilometers / 1531 nautical miles
Flight time: 3 h 50 min
Time difference: 1 h 30 min
CO2 emission: 197 kg

Distance from St. John's to Hall Beach

There are several ways to calculate the distance from St. John's to Hall Beach. Here are two standard methods:

Vincenty's formula (applied above)
  • 1761.575 miles
  • 2834.981 kilometers
  • 1530.767 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
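
For reference, Vincenty's inverse method can be implemented directly. The sketch below assumes the WGS-84 ellipsoid (the page does not state which ellipsoid it uses) and should reproduce the figures above to within rounding:

```python
import math

def vincenty_inverse(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Vincenty's inverse formula on the WGS-84 ellipsoid; returns metres."""
    a = 6378137.0            # semi-major axis (m)
    f = 1 / 298.257223563    # flattening
    b = (1 - f) * a          # semi-minor axis (m)

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):  # iterate lambda until it converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0       # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)  # equatorial line
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# YYT and YUX coordinates (decimal degrees, from the DMS values listed below)
m = vincenty_inverse(47.6183, -52.7517, 68.7758, -81.2433)
print(f"{m / 1609.344:.3f} mi / {m / 1000:.3f} km / {m / 1852:.3f} NM")
# expected to land near 1761.575 mi / 2834.981 km / 1530.767 NM
```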

Haversine formula
  • 1757.718 miles
  • 2828.773 kilometers
  • 1527.415 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
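
A compact implementation, assuming the IUGG mean Earth radius of 6371.0088 km (the page's exact radius is not stated, so the last decimals may differ):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0088):
    """Great-circle distance between two lat/lon points on a sphere, in km."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

print(haversine_km(47.6183, -52.7517, 68.7758, -81.2433))  # ≈ 2829 km
```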

How long does it take to fly from St. John's to Hall Beach?

The estimated flight time from St. John's International Airport to Hall Beach Airport is 3 hours and 50 minutes.
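
The calculator's exact model isn't published; a common back-of-the-envelope estimate divides the distance by an average jet cruise speed and adds a fixed allowance for taxi, climb, and descent. The parameters below are hypothetical, chosen only to illustrate the idea:

```python
def estimated_flight_time(distance_miles, cruise_mph=530, overhead_hours=0.5):
    """Cruise time plus a fixed takeoff/landing allowance (hypothetical values)."""
    hours = distance_miles / cruise_mph + overhead_hours
    h, m = divmod(round(hours * 60), 60)
    return f"{h} h {m} min"

print(estimated_flight_time(1762))  # "3 h 49 min", close to the figure above
```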

Flight carbon footprint between St. John's International Airport (YYT) and Hall Beach Airport (YUX)

On average, flying from St. John's to Hall Beach generates about 197 kg of CO2 per passenger; 197 kilograms equals about 435 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
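
The pound-to-kilogram conversion is fixed by definition, so the check is simple (the page's 435 lbs presumably reflects rounding from an unrounded kilogram value):

```python
LBS_PER_KG = 1 / 0.45359237  # the pound is defined as exactly 0.45359237 kg

co2_kg = 197
print(f"{co2_kg} kg ≈ {co2_kg * LBS_PER_KG:.1f} lbs")  # 197 kg ≈ 434.3 lbs
```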

Map of flight path from St. John's to Hall Beach

See the map of the shortest flight path between St. John's International Airport (YYT) and Hall Beach Airport (YUX).

Airport information

Origin: St. John's International Airport
City: St. John's
Country: Canada
IATA Code: YYT
ICAO Code: CYYT
Coordinates: 47°37′6″N, 52°45′6″W
Destination: Hall Beach Airport
City: Hall Beach
Country: Canada
IATA Code: YUX
ICAO Code: CYUX
Coordinates: 68°46′33″N, 81°14′36″W
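
Both distance formulas above take decimal degrees, while the coordinates here are listed as degrees, minutes, and seconds. A small conversion helper (the function name is ours, not the site's):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere letter to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# YYT: 47°37′6″N, 52°45′6″W  ->  (47.6183, -52.7517)
print(dms_to_decimal(47, 37, 6, "N"), dms_to_decimal(52, 45, 6, "W"))
# YUX: 68°46′33″N, 81°14′36″W  ->  (68.7758, -81.2433)
print(dms_to_decimal(68, 46, 33, "N"), dms_to_decimal(81, 14, 36, "W"))
```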