
How far is Hall Beach from Stephenville?

The distance between Stephenville (Stephenville International Airport) and Hall Beach (Hall Beach Airport) is 1600 miles / 2575 kilometers / 1391 nautical miles.

Stephenville International Airport – Hall Beach Airport

• Distance: 1600 miles / 2575 kilometers / 1391 nautical miles
• Flight time: 3 h 31 min
• Time difference: 1 h 30 min
• CO2 emission: 186 kg


Distance from Stephenville to Hall Beach

There are several ways to calculate the distance from Stephenville to Hall Beach. Here are two standard methods:

Vincenty's formula (applied above)
  • 1600.208 miles
  • 2575.286 kilometers
  • 1390.543 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
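For readers who want to reproduce the number, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid (a = 6378137 m, f = 1/298.257223563). The function name, iteration cap, and convergence tolerance are our own choices; the site does not publish its implementation.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns metres."""
    a = 6378137.0               # semi-major axis (m)
    f = 1 / 298.257223563       # flattening
    b = (1 - f) * a             # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate; may not converge for nearly antipodal points
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.hypot(cosU2 * sinLam,
                              cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        if sinSigma == 0:
            return 0.0          # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = (cosSigma - 2 * sinU1 * sinU2 / cos2Alpha
                      if cos2Alpha else 0.0)  # equatorial line
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (
                cos2SigmaM + C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < 1e-12:
            break

    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    deltaSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2) -
        B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2)
        * (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - deltaSigma)  # geodesic distance in metres
```

Called with the airport coordinates listed at the bottom of this page (YJT ≈ 48.5442°N, 58.5497°W; YUX ≈ 68.7758°N, 81.2433°W), this returns about 2,575 km, in line with the figure above.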

Haversine formula
  • 1596.821 miles
  • 2569.835 kilometers
  • 1387.600 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
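A minimal sketch of the haversine formula, assuming the conventional mean Earth radius of 6371 km (the site's exact radius is not stated):

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2):
    """Great-circle distance on a sphere of mean radius 6371 km; returns km."""
    R = 6371.0
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

# YJT -> YUX using the coordinates listed below (western longitudes negative)
print(haversine_distance(48.5442, -58.5497, 68.7758, -81.2433))
```

This prints about 2,570 km, consistent with the 2569.835 km quoted above.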

How long does it take to fly from Stephenville to Hall Beach?

The estimated flight time from Stephenville International Airport to Hall Beach Airport is 3 hours and 31 minutes.
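The page does not say how this estimate is computed. Calculators like this one typically divide the distance by a typical airliner cruise speed and add a fixed allowance for taxi, take-off, climb, and descent. The sketch below uses assumed values of 500 mph and 30 minutes, so it will not reproduce the 3 h 31 min figure exactly:

```python
def estimate_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough block time: cruise segment plus a fixed taxi/climb/descent allowance."""
    total_min = distance_miles / cruise_mph * 60 + overhead_min
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} h {minutes} min"

print(estimate_flight_time(1600))  # ~3 h 42 min with these assumed parameters
```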

Flight carbon footprint between Stephenville International Airport (YJT) and Hall Beach Airport (YUX)

On average, flying from Stephenville to Hall Beach generates about 186 kg (410 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
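As a sanity check on the arithmetic: 186 kg over 2,575 km works out to roughly 0.072 kg of CO2 per passenger-kilometre. The per-km factor below is back-derived from the page's own numbers, not taken from any published methodology:

```python
KG_PER_LB = 0.45359237           # exact kilograms per pound

co2_kg = 186                     # per-passenger estimate quoted above
distance_km = 2575
print(co2_kg / distance_km)      # ~0.072 kg CO2 per passenger-km implied
print(co2_kg / KG_PER_LB)        # ~410 lb, the imperial figure quoted above
```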

Map of flight path from Stephenville to Hall Beach

See the map of the shortest flight path between Stephenville International Airport (YJT) and Hall Beach Airport (YUX).

Airport information

Origin: Stephenville International Airport
City: Stephenville
Country: Canada
IATA Code: YJT
ICAO Code: CYJT
Coordinates: 48°32′39″N, 58°32′59″W
Destination: Hall Beach Airport
City: Hall Beach
Country: Canada
IATA Code: YUX
ICAO Code: CYUX
Coordinates: 68°46′33″N, 81°14′36″W
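The coordinates above are given in degrees, minutes, and seconds, while the distance formulas earlier on this page take decimal degrees (with western longitudes negative). A minimal conversion sketch, with a helper name of our own choosing:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds to signed decimal degrees."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

# Stephenville (YJT): 48°32'39"N, 58°32'59"W
print(dms_to_decimal(48, 32, 39, "N"), dms_to_decimal(58, 32, 59, "W"))
# Hall Beach (YUX): 68°46'33"N, 81°14'36"W
print(dms_to_decimal(68, 46, 33, "N"), dms_to_decimal(81, 14, 36, "W"))
```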