
How far is Gamètì from Sanikiluaq?

The distance between Sanikiluaq (Sanikiluaq Airport) and Gamètì (Gamètì/Rae Lakes Airport) is 1382 miles / 2224 kilometers / 1201 nautical miles.
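The three figures are the same distance expressed in different units, and the conversions can be checked with two exact constants (1 international mile = 1.609344 km, 1 nautical mile = 1.852 km):

```python
# Convert the quoted great-circle distance between unit systems.
# 1 international mile = 1.609344 km and 1 nautical mile = 1.852 km (exact).
KM_PER_MILE = 1.609344
KM_PER_NAUTICAL_MILE = 1.852

distance_km = 2224  # Sanikiluaq (YSK) to Gamètì (YRA), rounded

miles = distance_km / KM_PER_MILE
nautical_miles = distance_km / KM_PER_NAUTICAL_MILE

print(round(miles), round(nautical_miles))  # → 1382 1201
```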

The driving distance from Sanikiluaq (YSK) to Gamètì (YRA) is 3404 miles / 5479 kilometers, and the travel time by car is about 74 hours.

Sanikiluaq Airport – Gamètì/Rae Lakes Airport

1382 miles / 2224 kilometers / 1201 nautical miles


Distance from Sanikiluaq to Gamètì

There are several ways to calculate the distance from Sanikiluaq to Gamètì. Here are two standard methods:

Vincenty's formula (applied above)
  • 1382.139 miles
  • 2224.338 kilometers
  • 1201.046 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
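The page does not publish its implementation, but the standard Vincenty inverse solution on the WGS-84 ellipsoid reproduces the figure above. A sketch in Python, using the airport coordinates from the table at the bottom of the page converted to decimal degrees (the ellipsoid choice is an assumption):

```python
import math

def vincenty_inverse(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Geodesic distance in metres on the WGS-84 ellipsoid (Vincenty, inverse)."""
    a = 6378137.0              # WGS-84 semi-major axis, metres
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a

    L = math.radians(lon2 - lon1)
    # Reduced latitudes
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.sqrt((cosU2 * sin_lam) ** 2 +
                              (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m +
                C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
        B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) *
        (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# Sanikiluaq (YSK) to Gamètì (YRA)
km = vincenty_inverse(56.537778, -79.246667, 64.115833, -117.309722) / 1000
print(f"{km:.3f} km")  # ≈ 2224.3 km, matching the Vincenty figure above
```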

Haversine formula
  • 1377.443 miles
  • 2216.780 kilometers
  • 1196.965 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
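The haversine computation is short enough to verify directly. A minimal sketch, assuming a mean Earth radius of 6371 km (the exact radius the site uses is not stated, so the last decimal may differ):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given radius, in km."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2 +
         math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# Sanikiluaq (YSK) to Gamètì (YRA), decimal degrees from the airport table
km = haversine_km(56.537778, -79.246667, 64.115833, -117.309722)
print(f"{km:.1f} km")  # ≈ 2216.8 km, matching the haversine figure above
```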

How long does it take to fly from Sanikiluaq to Gamètì?

The estimated flight time from Sanikiluaq Airport to Gamètì/Rae Lakes Airport is 3 hours and 7 minutes.
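The calculator's exact timing model is not published. A common back-of-the-envelope estimate adds a fixed taxi/climb allowance to cruise time at an assumed average speed; both numbers below are illustrative assumptions, not the site's, so the result is only in the same ballpark as the 3 h 7 min quoted above:

```python
# Rough flight-time estimate: fixed overhead plus cruise at an assumed speed.
# The 30-minute allowance and 500 mph average speed are assumptions.
distance_miles = 1382
cruise_mph = 500
overhead_hours = 0.5

total_hours = overhead_hours + distance_miles / cruise_mph
hours = int(total_hours)
minutes = round((total_hours - hours) * 60)
print(f"{hours} h {minutes} min")
```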

Flight carbon footprint between Sanikiluaq Airport (YSK) and Gamètì/Rae Lakes Airport (YRA)

On average, flying from Sanikiluaq to Gamètì generates about 172 kg of CO2 per passenger, equivalent to 379 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
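The pound figure follows directly from the kilogram estimate (1 kg ≈ 2.20462 lb):

```python
# Convert the per-passenger CO2 estimate from kilograms to pounds.
KG_TO_LB = 2.20462  # pounds per kilogram
co2_kg = 172
co2_lb = co2_kg * KG_TO_LB
print(round(co2_lb))  # → 379
```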

Map of flight path and driving directions from Sanikiluaq to Gamètì

See the map of the shortest flight path between Sanikiluaq Airport (YSK) and Gamètì/Rae Lakes Airport (YRA).

Airport information

Origin: Sanikiluaq Airport
City: Sanikiluaq
Country: Canada
IATA Code: YSK
ICAO Code: CYSK
Coordinates: 56°32′16″N, 79°14′48″W
Destination: Gamètì/Rae Lakes Airport
City: Gamètì
Country: Canada
IATA Code: YRA
ICAO Code: CYRA
Coordinates: 64°6′57″N, 117°18′35″W