
How far is Windsor from Hall Beach?

The distance between Hall Beach (Hall Beach Airport) and Windsor (Windsor International Airport) is 1834 miles / 2952 kilometers / 1594 nautical miles.

Hall Beach Airport – Windsor International Airport

1834 miles / 2952 kilometers / 1594 nautical miles


Distance from Hall Beach to Windsor

There are several ways to calculate the distance from Hall Beach to Windsor. Here are two standard methods:

Vincenty's formula (applied above)
  • 1834.236 miles
  • 2951.916 kilometers
  • 1593.907 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
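
For reference, here is a minimal sketch of the standard Vincenty inverse solution in Python. It assumes the WGS-84 ellipsoid (the page does not name the ellipsoid it uses), and reproduces the figure quoted above to within rounding:

  import math

  # WGS-84 ellipsoid parameters (an assumption; the page does not state its ellipsoid)
  A = 6378137.0              # semi-major axis, metres
  F = 1 / 298.257223563      # flattening
  B = (1 - F) * A            # semi-minor axis, metres

  def vincenty_inverse(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
      """Geodesic distance in metres between two points, Vincenty's inverse method."""
      u1 = math.atan((1 - F) * math.tan(math.radians(lat1)))  # reduced latitudes
      u2 = math.atan((1 - F) * math.tan(math.radians(lat2)))
      ell = math.radians(lon2 - lon1)
      lam = ell
      sin_u1, cos_u1 = math.sin(u1), math.cos(u1)
      sin_u2, cos_u2 = math.sin(u2), math.cos(u2)

      for _ in range(max_iter):
          sin_lam, cos_lam = math.sin(lam), math.cos(lam)
          sin_sigma = math.hypot(cos_u2 * sin_lam,
                                 cos_u1 * sin_u2 - sin_u1 * cos_u2 * cos_lam)
          if sin_sigma == 0:
              return 0.0                      # coincident points
          cos_sigma = sin_u1 * sin_u2 + cos_u1 * cos_u2 * cos_lam
          sigma = math.atan2(sin_sigma, cos_sigma)
          sin_alpha = cos_u1 * cos_u2 * sin_lam / sin_sigma
          cos2_alpha = 1 - sin_alpha ** 2
          cos_2sm = (cos_sigma - 2 * sin_u1 * sin_u2 / cos2_alpha) if cos2_alpha else 0.0
          c = F / 16 * cos2_alpha * (4 + F * (4 - 3 * cos2_alpha))
          lam_prev = lam
          lam = ell + (1 - c) * F * sin_alpha * (
              sigma + c * sin_sigma * (cos_2sm + c * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
          if abs(lam - lam_prev) < tol:       # converged
              break

      u_sq = cos2_alpha * (A ** 2 - B ** 2) / B ** 2
      big_a = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
      big_b = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
      d_sigma = big_b * sin_sigma * (cos_2sm + big_b / 4 * (
          cos_sigma * (-1 + 2 * cos_2sm ** 2)
          - big_b / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
      return B * big_a * (sigma - d_sigma)

  # YUX -> YQG with the coordinates listed at the bottom of this page:
  # ≈ 2,951,916 m, i.e. the 2952 km / 1834 mi quoted above.
  print(vincenty_inverse(68.7758, -81.2433, 42.2756, -82.9556) / 1000)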

Haversine formula
  • 1832.071 miles
  • 2948.432 kilometers
  • 1592.026 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
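
A compact Python sketch of the haversine formula, assuming a mean Earth radius of 6371.0088 km (the page does not state which radius it uses):

  import math

  MEAN_EARTH_RADIUS_KM = 6371.0088   # assumed mean radius

  def haversine_km(lat1, lon1, lat2, lon2):
      """Great-circle distance in kilometres on a spherical Earth."""
      phi1, phi2 = math.radians(lat1), math.radians(lat2)
      d_phi = math.radians(lat2 - lat1)
      d_lam = math.radians(lon2 - lon1)
      a = (math.sin(d_phi / 2) ** 2
           + math.cos(phi1) * math.cos(phi2) * math.sin(d_lam / 2) ** 2)
      return 2 * MEAN_EARTH_RADIUS_KM * math.asin(math.sqrt(a))

  # YUX -> YQG: ≈ 2948 km, matching the haversine figure above.
  print(haversine_km(68.7758, -81.2433, 42.2756, -82.9556))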

How long does it take to fly from Hall Beach to Windsor?

The estimated flight time from Hall Beach Airport to Windsor International Airport is 3 hours and 58 minutes.
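
The page does not say how it derives this figure. A common rule of thumb adds a fixed allowance for taxi, climb, and descent to cruise time at a typical jet speed; the sketch below uses assumed values (500 mph cruise, 30 minutes overhead), so it only approximates the quoted 3 hours 58 minutes:

  def flight_time(distance_miles, cruise_mph=500, overhead_minutes=30):
      # Both parameters are assumptions, not the site's published model.
      minutes = overhead_minutes + distance_miles / cruise_mph * 60
      return divmod(round(minutes), 60)

  hours, mins = flight_time(1834)
  print(f"{hours} h {mins} min")   # ≈ 4 h 10 min with these assumed parameters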

What is the time difference between Hall Beach and Windsor?

There is no time difference between Hall Beach and Windsor.
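
Both cities observe Eastern Time, which can be checked with Python's zoneinfo module (the IANA zone names below are assumptions: Hall Beach, Nunavut falls under America/Iqaluit, and Windsor, Ontario under America/Toronto):

  from datetime import datetime
  from zoneinfo import ZoneInfo

  now = datetime.now(tz=ZoneInfo("UTC"))
  diff = (now.astimezone(ZoneInfo("America/Iqaluit")).utcoffset()
          - now.astimezone(ZoneInfo("America/Toronto")).utcoffset())
  print(diff)   # 0:00:00 -> no time difference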

Flight carbon footprint between Hall Beach Airport (YUX) and Windsor International Airport (YQG)

On average, flying from Hall Beach to Windsor generates about 203 kg (447 lb) of CO2 per passenger. This figure is an estimate and includes only the CO2 generated by burning jet fuel.
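
As a sketch, the per-passenger estimate can be expressed as distance times an average emission factor. The factor below is back-derived from this page's own numbers (203 kg over 1834 miles); real calculators use aircraft- and load-factor-specific values:

  KG_PER_LB = 0.45359237

  def co2_kg(distance_miles, kg_per_mile=203 / 1834):
      # kg_per_mile is an assumed average factor, not the site's published model
      return distance_miles * kg_per_mile

  kg = co2_kg(1834)
  print(f"{kg:.0f} kg ≈ {kg / KG_PER_LB:.1f} lb")   # 203 kg ≈ 447.5 lb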

Map of flight path from Hall Beach to Windsor

See the map of the shortest flight path between Hall Beach Airport (YUX) and Windsor International Airport (YQG).

Airport information

Origin: Hall Beach Airport
City: Hall Beach
Country: Canada
IATA Code: YUX
ICAO Code: CYUX
Coordinates: 68°46′33″N, 81°14′36″W
Destination: Windsor International Airport
City: Windsor
Country: Canada
IATA Code: YQG
ICAO Code: CYQG
Coordinates: 42°16′32″N, 82°57′20″W
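
The coordinates above are given in degrees/minutes/seconds, while the distance formulas earlier expect decimal degrees. A small, hypothetical helper for the conversion:

  def dms_to_decimal(degrees, minutes, seconds, hemisphere):
      """Convert degrees/minutes/seconds to signed decimal degrees."""
      sign = -1.0 if hemisphere in ("S", "W") else 1.0
      return sign * (degrees + minutes / 60 + seconds / 3600)

  yux = (dms_to_decimal(68, 46, 33, "N"), dms_to_decimal(81, 14, 36, "W"))
  yqg = (dms_to_decimal(42, 16, 32, "N"), dms_to_decimal(82, 57, 20, "W"))
  print(yux)   # ≈ (68.7758, -81.2433)
  print(yqg)   # ≈ (42.2756, -82.9556)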