How far is Arctic Bay from St. John's?

The distance between St. John's (St. John's International Airport) and Arctic Bay (Arctic Bay Airport) is 2022 miles / 3254 kilometers / 1757 nautical miles.

St. John's International Airport – Arctic Bay Airport

Distance: 2022 miles / 3254 kilometers / 1757 nautical miles
Flight time: 4 h 19 min
Time difference: 2 h 30 min
CO2 emission: 220 kg

Distance from St. John's to Arctic Bay

There are several ways to calculate the distance from St. John's to Arctic Bay. Here are two standard methods:

Vincenty's formula (applied above)
  • 2021.864 miles
  • 3253.874 kilometers
  • 1756.951 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
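To make the figure reproducible, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The function name, the WGS-84 constants, and the decimal coordinates (converted from the DMS values in the airport information below) are illustrative assumptions; the result should come out very close to the 2021.864 miles quoted above.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Distance between two lat/lon points (degrees) on the WGS-84 ellipsoid, in miles."""
    a = 6378137.0                 # WGS-84 semi-major axis, metres
    f = 1 / 298.257223563        # WGS-84 flattening
    b = (1 - f) * a

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))   # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    lam = L

    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0                                        # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sigma_m ** 2)))
    metres = b * A * (sigma - delta_sigma)
    return metres / 1609.344      # metres -> statute miles

# YYT (47°37′6″N, 52°45′6″W) to YAB (73°0′20″N, 85°2′33″W)
print(round(vincenty_miles(47.6183, -52.7517, 73.0056, -85.0425), 1))  # close to the ≈2022 miles above
```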

Haversine formula
  • 2017.193 miles
  • 3246.358 kilometers
  • 1752.893 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
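A matching sketch of the haversine formula. The 6371 km mean earth radius is an assumed value; the exact mileage depends on which radius the calculator uses, which is why the spherical result differs slightly from the ellipsoidal one.

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points (degrees), assuming a spherical earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlon = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlon / 2) ** 2
    km = 2 * radius_km * math.asin(math.sqrt(a))
    return km / 1.609344          # kilometres -> statute miles

print(round(haversine_miles(47.6183, -52.7517, 73.0056, -85.0425), 1))  # close to the ≈2017 miles above
```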

How long does it take to fly from St. John's to Arctic Bay?

The estimated flight time from St. John's International Airport to Arctic Bay Airport is 4 hours and 19 minutes.
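The calculator does not publish its parameters, but estimates like this are typically computed as distance divided by an assumed average cruise speed plus a fixed allowance for climb, descent and taxi. The sketch below uses an assumed 850 km/h cruise speed and a 30-minute allowance, which happens to land near the quoted figure; the site's actual values may differ.

```python
def estimate_flight_time(distance_km, cruise_kmh=850, overhead_min=30):
    """Rough block-time estimate: cruise time plus a fixed climb/descent allowance (assumed values)."""
    minutes = distance_km / cruise_kmh * 60 + overhead_min
    hours, mins = divmod(round(minutes), 60)
    return f"{hours} h {mins} min"

print(estimate_flight_time(3254))  # "4 h 20 min", close to the 4 h 19 min quoted above
```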

Flight carbon footprint between St. John's International Airport (YYT) and Arctic Bay Airport (YAB)

On average, flying from St. John's to Arctic Bay generates about 220 kg (485 lbs) of CO2 per passenger. The figures are estimates and include only the CO2 generated by burning jet fuel.
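The methodology behind the 220 kg figure is not published; the sketch below only shows the kilogram-to-pound conversion and the per-mile emission rate implied by the numbers above.

```python
co2_kg = 220
distance_miles = 2022

co2_lbs = co2_kg * 2.20462            # kilograms -> pounds
per_mile = co2_kg / distance_miles    # implied emission rate per passenger-mile

print(f"{co2_lbs:.0f} lbs")                       # ≈ 485 lbs
print(f"{per_mile * 1000:.0f} g CO2 per mile")    # ≈ 109 g CO2 per passenger-mile
```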

Map of flight path from St. John's to Arctic Bay

See the map of the shortest flight path between St. John's International Airport (YYT) and Arctic Bay Airport (YAB).

Airport information

Origin: St. John's International Airport
City: St. John's
Country: Canada
IATA Code: YYT
ICAO Code: CYYT
Coordinates: 47°37′6″N, 52°45′6″W

Destination: Arctic Bay Airport
City: Arctic Bay
Country: Canada
IATA Code: YAB
ICAO Code: CYAB
Coordinates: 73°0′20″N, 85°2′33″W
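
The coordinates above are listed in degrees, minutes and seconds, while the distance formulas earlier on the page expect decimal degrees. A small conversion sketch (the function name is illustrative; south and west get a negative sign by the usual convention):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# St. John's International Airport (YYT): 47°37′6″N, 52°45′6″W
print(dms_to_decimal(47, 37, 6, "N"), dms_to_decimal(52, 45, 6, "W"))   # ≈ 47.6183, -52.7517

# Arctic Bay Airport (YAB): 73°0′20″N, 85°2′33″W
print(dms_to_decimal(73, 0, 20, "N"), dms_to_decimal(85, 2, 33, "W"))   # ≈ 73.0056, -85.0425
```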