
How far is Hall Beach from Kingston?

The distance between Kingston (Kingston Norman Rogers Airport) and Hall Beach (Hall Beach Airport) is 1707 miles / 2747 kilometers / 1483 nautical miles.


Distance from Kingston to Hall Beach

There are several ways to calculate the distance from Kingston to Hall Beach. Here are two standard methods:

Vincenty's formula (applied above)
  • 1706.725 miles
  • 2746.708 kilometers
  • 1483.104 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
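The page does not publish its implementation, but the standard Vincenty inverse method can be sketched in a few dozen lines. Below is a minimal Python sketch on the WGS-84 ellipsoid; the function name, iteration cap, and convergence tolerance are illustrative choices, not the calculator's actual code.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns metres."""
    a = 6378137.0            # semi-major axis (m)
    f = 1 / 298.257223563    # flattening
    b = (1 - f) * a          # semi-minor axis (m)

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    lam = L

    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    for _ in range(200):  # iterate lambda until convergence
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
                * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# YGK (44°13′31″N, 76°35′48″W) to YUX (68°46′33″N, 81°14′36″W)
d_m = vincenty_distance(44.2253, -76.5967, 68.7758, -81.2433)
print(f"{d_m / 1000:.1f} km")  # ~2746.7 km, matching the figure above
```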

Haversine formula
  • 1704.405 miles
  • 2742.973 kilometers
  • 1481.087 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance, the shortest path between two points along the surface of the sphere).
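The haversine formula is compact enough to show in full. Here is a minimal Python sketch; the 6,371 km mean Earth radius is an assumption (the page does not state which radius it uses), but it reproduces the figure above to within a kilometre.

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given mean radius (km)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

print(f"{haversine_distance(44.2253, -76.5967, 68.7758, -81.2433):.1f} km")
# ~2743.0 km, matching the haversine figure above
```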

How long does it take to fly from Kingston to Hall Beach?

The estimated flight time from Kingston Norman Rogers Airport to Hall Beach Airport is 3 hours and 43 minutes.
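The page does not state how this estimate is derived; back-solving from the figures, 1,707 miles in 3 hours 43 minutes implies an average block speed of roughly 460 mph. A hypothetical sketch under that assumption (the function name and default speed are illustrative):

```python
def flight_time(distance_miles, avg_speed_mph=460.0):
    """Format a flight time estimate from distance and an assumed block speed."""
    hours = distance_miles / avg_speed_mph
    h = int(hours)
    m = round((hours - h) * 60)
    return f"{h} hours and {m} minutes"

print(flight_time(1707))  # "3 hours and 43 minutes" at the assumed ~460 mph
```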

What is the time difference between Kingston and Hall Beach?

There is no time difference between Kingston and Hall Beach.

Flight carbon footprint between Kingston Norman Rogers Airport (YGK) and Hall Beach Airport (YUX)

On average, flying from Kingston to Hall Beach generates about 193 kg (426 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
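The page does not state its emission model; dividing the quoted figures gives roughly 0.07 kg of CO2 per passenger-kilometre for this route. A hypothetical sketch using that back-solved factor (constant and function names are illustrative):

```python
KG_PER_PAX_KM = 193 / 2747   # back-solved from the figures above; actual model unknown
LBS_PER_KG = 2.20462

def co2_per_passenger(distance_km, kg_per_km=KG_PER_PAX_KM):
    """Rough per-passenger CO2 estimate for a flight of the given length."""
    kg = distance_km * kg_per_km
    return round(kg), round(kg * LBS_PER_KG)

print(co2_per_passenger(2747))  # (193, 425); the page's 426 lbs suggests it
                                # converts an unrounded kilogram figure
```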

Map of flight path from Kingston to Hall Beach

See the map of the shortest flight path between Kingston Norman Rogers Airport (YGK) and Hall Beach Airport (YUX).

Airport information

Origin Kingston Norman Rogers Airport
City: Kingston
Country: Canada
IATA Code: YGK
ICAO Code: CYGK
Coordinates: 44°13′31″N, 76°35′48″W
Destination Hall Beach Airport
City: Hall Beach
Country: Canada
IATA Code: YUX
ICAO Code: CYUX
Coordinates: 68°46′33″N, 81°14′36″W
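The coordinates above are given in degrees/minutes/seconds, while the distance formulas earlier expect signed decimal degrees. A small conversion helper (a sketch; the function name is illustrative):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds + hemisphere letter to signed decimal degrees."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

ygk = (dms_to_decimal(44, 13, 31, "N"), dms_to_decimal(76, 35, 48, "W"))
yux = (dms_to_decimal(68, 46, 33, "N"), dms_to_decimal(81, 14, 36, "W"))
print(ygk)  # (44.2253..., -76.5967...)
print(yux)  # (68.7758..., -81.2433...)
```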