
How far is Arctic Bay from Port Hope Simpson?

The distance between Port Hope Simpson (Port Hope Simpson Airport) and Arctic Bay (Arctic Bay Airport) is 1650 miles / 2655 kilometers / 1433 nautical miles.

Port Hope Simpson Airport – Arctic Bay Airport

Distance: 1650 miles / 2655 kilometers / 1433 nautical miles
Flight time: 3 h 37 min
Time difference: 2 h 30 min
CO2 emission: 189 kg


Distance from Port Hope Simpson to Arctic Bay

There are several ways to calculate the distance from Port Hope Simpson to Arctic Bay. Here are two standard methods:

Vincenty's formula (applied above)
  • 1649.596 miles
  • 2654.768 kilometers
  • 1433.460 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
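As a rough cross-check on the ellipsoidal figure, a geodesic distance can be computed with an off-the-shelf library. The sketch below is not the calculator's own code: it uses the geopy library (an assumption for illustration), with the airport coordinates listed further down converted to decimal degrees.

```python
# A minimal sketch, assuming geopy (`pip install geopy`): its geodesic()
# distance uses an ellipsoidal earth model (WGS-84 by default), so the
# result should land very close to the Vincenty figure quoted above.
from geopy.distance import geodesic

port_hope_simpson = (52.528056, -56.285833)   # YHA, 52°31′41″N 56°17′9″W
arctic_bay = (73.005556, -85.042500)          # YAB, 73°0′20″N 85°2′33″W

d = geodesic(port_hope_simpson, arctic_bay)
print(f"{d.km:.1f} km / {d.miles:.1f} mi / {d.nautical:.1f} nm")
# Expect roughly 2655 km / 1650 mi / 1433 nm
```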

Haversine formula
  • 1645.209 miles
  • 2647.707 kilometers
  • 1429.648 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
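The haversine formula is short enough to implement directly. The sketch below is a minimal Python version assuming a mean earth radius of 6371 km (the calculator's exact radius is not stated), so it should land close to, though not exactly on, the 2647.7 km figure above.

```python
# Great-circle (haversine) distance on a spherical earth, mean radius 6371 km.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Shortest distance between two lat/lon points on a sphere, in km."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
    return 2 * radius_km * asin(sqrt(a))

d_km = haversine_km(52.528056, -56.285833, 73.005556, -85.042500)
print(f"{d_km:.1f} km  ({d_km * 0.621371:.1f} mi, {d_km / 1.852:.1f} nm)")
# Roughly 2648 km / 1645 mi / 1430 nm
```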

How long does it take to fly from Port Hope Simpson to Arctic Bay?

The estimated flight time from Port Hope Simpson Airport to Arctic Bay Airport is 3 hours and 37 minutes.
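The article does not say how this estimate is derived. A common approach is a fixed allowance for taxi, climb and descent plus cruise time at an assumed average speed; the sketch below uses illustrative values (30 minutes of overhead, 850 km/h cruise) that happen to reproduce 3 h 37 min for this distance, but they are assumptions, not the calculator's published parameters.

```python
# Hedged flight-time estimate: fixed overhead plus cruise time.
# The overhead and cruise speed below are illustrative assumptions only.
def estimated_flight_time(distance_km, cruise_kmh=850.0, overhead_min=30.0):
    total_min = overhead_min + distance_km / cruise_kmh * 60.0
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} h {minutes} min"

print(estimated_flight_time(2655))  # -> "3 h 37 min" with these assumptions
```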

Flight carbon footprint between Port Hope Simpson Airport (YHA) and Arctic Bay Airport (YAB)

On average, flying from Port Hope Simpson to Arctic Bay generates about 189 kg (roughly 417 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
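As a quick unit check on the figure above, the kilograms-to-pounds conversion is straightforward:

```python
# 189 kg of CO2 expressed in pounds (1 lb = 0.45359237 kg).
kg_per_lb = 0.45359237
co2_kg = 189
print(f"{co2_kg / kg_per_lb:.0f} lbs")  # ≈ 417 lbs
```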

Map of flight path from Port Hope Simpson to Arctic Bay

See the map of the shortest flight path between Port Hope Simpson Airport (YHA) and Arctic Bay Airport (YAB).

Airport information

Origin: Port Hope Simpson Airport
City: Port Hope Simpson
Country: Canada
IATA Code: YHA
ICAO Code: CCP4
Coordinates: 52°31′41″N, 56°17′9″W
Destination: Arctic Bay Airport
City: Arctic Bay
Country: Canada
IATA Code: YAB
ICAO Code: CYAB
Coordinates: 73°0′20″N, 85°2′33″W
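
For reference, a small sketch converting the DMS coordinates listed above into the decimal degrees used by the distance formulas (west longitudes negative):

```python
# Degrees/minutes/seconds to decimal degrees; S and W hemispheres are negative.
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

yha = (dms_to_decimal(52, 31, 41, "N"), dms_to_decimal(56, 17, 9, "W"))
yab = (dms_to_decimal(73, 0, 20, "N"), dms_to_decimal(85, 2, 33, "W"))
print(yha)  # ≈ (52.5281, -56.2858)
print(yab)  # ≈ (73.0056, -85.0425)
```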