
How far is Salluit from Uranium City?

The distance between Uranium City (Uranium City Airport) and Salluit (Salluit Airport) is 1110 miles / 1786 kilometers / 965 nautical miles.

The driving distance from Uranium City (YBE) to Salluit (YZG) is 3047 miles / 4903 kilometers, and travel time by car is about 83 hours 50 minutes.

Uranium City Airport – Salluit Airport

  • 1110 miles
  • 1786 kilometers
  • 965 nautical miles


Distance from Uranium City to Salluit

There are several ways to calculate the distance from Uranium City to Salluit. Here are two standard methods:

Vincenty's formula (applied above)
  • 1109.991 miles
  • 1786.357 kilometers
  • 964.555 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
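The iterative inverse form of Vincenty's formula can be sketched as follows, using the WGS-84 ellipsoid parameters and the airport coordinates listed in the airport information below (converted to decimal degrees). This is a standard textbook implementation, not the calculator's own code:

```python
import math

def vincenty_distance_m(lat1, lon1, lat2, lon2):
    """Inverse Vincenty: distance in metres on the WGS-84 ellipsoid."""
    a = 6378137.0                 # semi-major axis (m)
    f = 1 / 298.257223563         # flattening
    b = (1 - f) * a               # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    # Reduced latitudes on the auxiliary sphere
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):          # iterate lambda until convergence
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.sqrt((cosU2 * sinLam) ** 2 +
                             (cosU1 * sinU2 - sinU1 * cosU2 * cosLam) ** 2)
        if sinSigma == 0:
            return 0.0            # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = (cosSigma - 2 * sinU1 * sinU2 / cos2Alpha
                      if cos2Alpha else 0.0)   # equatorial line
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (cos2SigmaM +
            C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < 1e-12:
            break

    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    deltaSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2) -
        B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2) *
        (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - deltaSigma)

# YBE (59°33′41″N, 108°28′51″W) to YZG (62°10′45″N, 75°40′1″W)
km = vincenty_distance_m(59.561389, -108.480833, 62.179167, -75.666944) / 1000
```

Run with these coordinates, the result agrees with the 1786.357 km figure above to within a few metres.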

Haversine formula
  • 1105.963 miles
  • 1779.875 kilometers
  • 961.056 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
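The haversine formula is much simpler, since it needs only a single mean Earth radius (6371 km is the commonly used value, and appears to reproduce the figure above; the exact radius the calculator uses is an assumption):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in km between two lat/lon points on a sphere."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2 +
         math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h))

# Same airport coordinates as above, in decimal degrees
km = haversine_km(59.561389, -108.480833, 62.179167, -75.666944)
```

The spherical result lands about 6.5 km short of the ellipsoidal Vincenty figure on this route, which is typical: the two methods usually differ by well under 0.5%.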

How long does it take to fly from Uranium City to Salluit?

The estimated flight time from Uranium City Airport to Salluit Airport is 2 hours and 36 minutes.
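The page does not state how the flight time is derived. A common rule of thumb is cruise time at a typical airliner speed plus a fixed buffer for taxi, climb, and descent; the parameters below are illustrative assumptions, so the result is close to, but not exactly, the 2 h 36 min figure above:

```python
def estimate_flight_minutes(distance_miles, cruise_mph=500.0, overhead_min=30.0):
    """Rough block time: cruise leg at an assumed average speed,
    plus a fixed taxi/climb/descent allowance. Both parameters are
    illustrative assumptions, not the calculator's actual model."""
    return distance_miles / cruise_mph * 60.0 + overhead_min

minutes = estimate_flight_minutes(1110)
```

With these assumed parameters the estimate comes out near 163 minutes; shorter routes are dominated by the fixed overhead, which is why small distance changes barely move the block time.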

Flight carbon footprint between Uranium City Airport (YBE) and Salluit Airport (YZG)

On average, flying from Uranium City to Salluit generates about 157 kg of CO2 per passenger, and 157 kilograms equals 346 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
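A fuel-burn estimate of this kind typically multiplies the distance by a per-passenger fuel consumption and by the stoichiometric factor of about 3.16 kg of CO2 per kg of jet fuel burned. The per-passenger fuel burn below is an assumed value chosen to illustrate the calculation, not the calculator's published parameter:

```python
def co2_per_passenger_kg(distance_km,
                         fuel_kg_per_pax_km=0.0278,   # assumed fuel burn per passenger-km
                         co2_per_fuel_kg=3.16):       # kg CO2 per kg jet fuel (combustion)
    """Illustrative per-passenger CO2 estimate for a flight of the given length."""
    return distance_km * fuel_kg_per_pax_km * co2_per_fuel_kg

kg = co2_per_passenger_kg(1786.357)
```

With the assumed fuel burn this gives roughly 157 kg for this route. Note that such figures exclude non-CO2 climate effects (contrails, NOx) and upstream fuel production.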

Map of flight path and driving directions from Uranium City to Salluit

See the map of the shortest flight path between Uranium City Airport (YBE) and Salluit Airport (YZG).

Airport information

Origin Uranium City Airport
City: Uranium City
Country: Canada
IATA Code: YBE
ICAO Code: CYBE
Coordinates: 59°33′41″N, 108°28′51″W
Destination Salluit Airport
City: Salluit
Country: Canada
IATA Code: YZG
ICAO Code: CYZG
Coordinates: 62°10′45″N, 75°40′1″W
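The coordinates above are given in degrees/minutes/seconds, while the distance formulas need signed decimal degrees. A small converter covers the translation (hemisphere letters S and W produce negative values):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter (N/S/E/W)
    to signed decimal degrees."""
    sign = -1.0 if hemisphere in ("S", "W") else 1.0
    return sign * (degrees + minutes / 60.0 + seconds / 3600.0)

# Uranium City Airport: 59°33′41″N, 108°28′51″W
lat = dms_to_decimal(59, 33, 41, "N")
lon = dms_to_decimal(108, 28, 51, "W")
```

For Uranium City this yields approximately (59.5614, -108.4808), the values used in the distance examples above.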