
How far is Arctic Bay from Norway House?

The distance between Norway House (Norway House Airport) and Arctic Bay (Arctic Bay Airport) is 1370 miles / 2205 kilometers / 1191 nautical miles.

Norway House Airport – Arctic Bay Airport: 1370 miles / 2205 kilometers / 1191 nautical miles


Distance from Norway House to Arctic Bay

There are several ways to calculate the distance from Norway House to Arctic Bay. Here are two standard methods:

Vincenty's formula (applied above)
  • 1370.389 miles
  • 2205.428 kilometers
  • 1190.836 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
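As a concrete sketch, here is the standard Vincenty inverse algorithm implemented on the WGS-84 ellipsoid. The coordinates are the airport positions listed at the bottom of this page converted to decimal degrees; that the calculator uses exactly the WGS-84 constants is an assumption, though it is the usual choice.

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2):
    """Ellipsoidal distance in km via Vincenty's inverse formula (WGS-84)."""
    a = 6378137.0                 # semi-major axis, meters
    f = 1 / 298.257223563         # flattening
    b = (1 - f) * a               # semi-minor axis

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate lambda until convergence
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        if cos2_alpha == 0:
            cos_2sm = 0.0  # equatorial line
        else:
            cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sm + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sm ** 2)
            - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
                * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0

# YNE (53°57′29″N, 97°50′39″W) and YAB (73°0′20″N, 85°2′33″W) in decimal degrees
dist = vincenty_km(53.958056, -97.844167, 73.005556, -85.0425)
```

With these inputs the result agrees with the Vincenty figure quoted above to well under a kilometer.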

Haversine formula
  • 1366.961 miles
  • 2199.910 kilometers
  • 1187.856 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
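The haversine computation is only a few lines. It assumes a spherical Earth; the mean radius of 6371 km used below is the conventional choice (that the site uses this exact radius is an assumption, but it reproduces the figure above).

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, r=6371.0):
    """Great-circle distance in km between two lat/lon points (spherical Earth)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    # a is the square of half the chord length between the points
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# YNE (53°57′29″N, 97°50′39″W) and YAB (73°0′20″N, 85°2′33″W) in decimal degrees
dist = haversine_km(53.958056, -97.844167, 73.005556, -85.0425)
```

Note that the haversine result comes out a few kilometers shorter than Vincenty's here, since the spherical model ignores the Earth's flattening.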

How long does it take to fly from Norway House to Arctic Bay?

The estimated flight time from Norway House Airport to Arctic Bay Airport is 3 hours and 5 minutes.
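The page does not state how this estimate is derived. A common heuristic, sketched below with assumed values, adds a fixed overhead for taxi, climb, and descent to cruise time at a typical jet speed; with a 30-minute overhead and 500 mph cruise it yields about 3 h 14 min, slightly above the 3 h 5 min quoted, so the calculator presumably assumes a somewhat faster cruise speed or smaller overhead.

```python
def flight_time_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough flight-time estimate: fixed taxi/climb/descent overhead plus cruise.

    cruise_mph and overhead_min are illustrative assumptions, not the
    calculator's actual parameters.
    """
    return overhead_min + distance_miles / cruise_mph * 60

minutes = flight_time_minutes(1370)  # distance from the figures above
print(f"{int(minutes // 60)} h {round(minutes % 60)} min")
```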

What is the time difference between Norway House and Arctic Bay?

There is no time difference between Norway House and Arctic Bay.

Flight carbon footprint between Norway House Airport (YNE) and Arctic Bay Airport (YAB)

On average, flying from Norway House to Arctic Bay generates about 171 kg of CO2 per passenger, roughly 378 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
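The unit conversion behind that figure is simple. Note that 171 kg converts to 377 lb when rounded directly, so the 378 lb quoted above was presumably converted from an unrounded kilogram value.

```python
KG_TO_LB = 2.20462  # pounds per kilogram

def kg_to_lb(kg):
    """Convert a mass in kilograms to pounds."""
    return kg * KG_TO_LB
```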

Map of flight path from Norway House to Arctic Bay

See the map of the shortest flight path between Norway House Airport (YNE) and Arctic Bay Airport (YAB).

Airport information

Origin: Norway House Airport
City: Norway House
Country: Canada
IATA Code: YNE
ICAO Code: CYNE
Coordinates: 53°57′29″N, 97°50′39″W

Destination: Arctic Bay Airport
City: Arctic Bay
Country: Canada
IATA Code: YAB
ICAO Code: CYAB
Coordinates: 73°0′20″N, 85°2′33″W