How far is Arctic Bay from Rankin Inlet?

The distance between Rankin Inlet (Rankin Inlet Airport) and Arctic Bay (Arctic Bay Airport) is 729 miles / 1173 kilometers / 633 nautical miles.

Rankin Inlet Airport – Arctic Bay Airport

  • 729 miles
  • 1173 kilometers
  • 633 nautical miles

Distance from Rankin Inlet to Arctic Bay

There are several ways to calculate the distance from Rankin Inlet to Arctic Bay. Here are two standard methods:

Vincenty's formula (applied above)
  • 728.992 miles
  • 1173.198 kilometers
  • 633.476 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
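As an illustrative sketch (not the calculator's own code), Vincenty's inverse formula on the WGS-84 ellipsoid can be implemented as follows, using the airport coordinates listed at the bottom of this page:

```python
import math

def vincenty_inverse(lat1, lon1, lat2, lon2):
    """Distance in meters between two points on the WGS-84 ellipsoid
    (Vincenty's inverse formula, iterative)."""
    a = 6378137.0                  # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563          # WGS-84 flattening
    b = (1 - f) * a                # semi-minor axis

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):           # iterate until lambda converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0             # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# YRT: 62°48′41″N, 92°6′56″W    YAB: 73°0′20″N, 85°2′33″W
yrt = (62 + 48 / 60 + 41 / 3600, -(92 + 6 / 60 + 56 / 3600))
yab = (73 + 0 / 60 + 20 / 3600, -(85 + 2 / 60 + 33 / 3600))
km = vincenty_inverse(*yrt, *yab) / 1000
```

The iteration converges quickly for non-antipodal points like these; the result agrees with the 1173.198 km figure above to within rounding.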

Haversine formula
  • 726.745 miles
  • 1169.583 kilometers
  • 631.524 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
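The haversine formula is much simpler. A minimal sketch, again using the airport coordinates from the table below and the common mean earth radius of 6371 km:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance assuming a spherical earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# YRT -> YAB, degrees (west longitudes negative)
km = haversine_km(62 + 48 / 60 + 41 / 3600, -(92 + 6 / 60 + 56 / 3600),
                  73 + 0 / 60 + 20 / 3600, -(85 + 2 / 60 + 33 / 3600))
```

The spherical result comes out about 3.6 km shorter than the ellipsoidal one here, which matches the difference between the two figures above.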

How long does it take to fly from Rankin Inlet to Arctic Bay?

The estimated flight time from Rankin Inlet Airport to Arctic Bay Airport is 1 hour and 52 minutes.
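A common way to approximate flight time is distance divided by a typical cruise speed, plus a fixed allowance for taxi, climb, and descent. The 500 mph cruise speed and 30-minute allowance below are assumptions, not the calculator's published parameters, so the result only roughly matches the 1 hour 52 minutes above:

```python
def estimated_flight_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    # assumed cruise speed and fixed taxi/climb/descent allowance
    return overhead_min + distance_miles / cruise_mph * 60

minutes = estimated_flight_minutes(729)  # ~117 minutes with these assumptions
```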

What is the time difference between Rankin Inlet and Arctic Bay?

There is no time difference between Rankin Inlet and Arctic Bay.

Flight carbon footprint between Rankin Inlet Airport (YRT) and Arctic Bay Airport (YAB)

On average, flying from Rankin Inlet to Arctic Bay generates about 128 kg (282 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
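The kilogram-to-pound conversion behind that figure is straightforward (the pound is defined as exactly 0.45359237 kg):

```python
KG_PER_LB = 0.45359237           # exact definition of the avoirdupois pound

def kg_to_lb(kg):
    """Convert kilograms to pounds."""
    return kg / KG_PER_LB

lb = kg_to_lb(128)               # about 282 lbs
```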

Map of flight path from Rankin Inlet to Arctic Bay

See the map of the shortest flight path between Rankin Inlet Airport (YRT) and Arctic Bay Airport (YAB).

Airport information

Origin: Rankin Inlet Airport
City: Rankin Inlet
Country: Canada
IATA Code: YRT
ICAO Code: CYRT
Coordinates: 62°48′41″N, 92°6′56″W
Destination: Arctic Bay Airport
City: Arctic Bay
Country: Canada
IATA Code: YAB
ICAO Code: CYAB
Coordinates: 73°0′20″N, 85°2′33″W