How far is Aupaluk from Iqaluit?

The distance between Iqaluit (Iqaluit Airport) and Aupaluk (Aupaluk Airport) is 311 miles / 500 kilometers / 270 nautical miles.

The driving distance from Iqaluit (YFB) to Aupaluk (YPJ) is 2616 miles / 4210 kilometers, and travel time by car is about 98 hours 14 minutes.

Iqaluit Airport – Aupaluk Airport

311 miles / 500 kilometers / 270 nautical miles

Distance from Iqaluit to Aupaluk

There are several ways to calculate the distance from Iqaluit to Aupaluk. Here are two standard methods:

Vincenty's formula (applied above)
  • 310.720 miles
  • 500.055 kilometers
  • 270.008 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
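
As a worked illustration, here is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid. The coordinates are the airport positions listed below, converted to decimal degrees; the function name, tolerance, and iteration cap are our choices, and the sketch omits the antipodal edge cases a production implementation would handle.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Vincenty's inverse formula on the WGS-84 ellipsoid (statute miles)."""
    a = 6378137.0                     # semi-major axis, metres
    f = 1 / 298.257223563             # flattening
    b = (1 - f) * a                   # semi-minor axis, metres

    # Reduced latitudes and longitude difference
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):         # iterate lambda until it converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0                # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (0.0 if cos2_alpha == 0 else
                        cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
        - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
        * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1609.344   # metres -> miles

# YFB (63°45′23″N, 68°33′20″W) to YPJ (59°17′48″N, 69°35′58″W), decimal degrees
print(vincenty_miles(63.75639, -68.55556, 59.29667, -69.59944))  # ≈ 310.7 miles
```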

Haversine formula
  • 310.038 miles
  • 498.958 kilometers
  • 269.416 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
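
A corresponding haversine sketch, assuming a mean Earth radius of 3,958.8 statute miles (the radius choice is ours; a slightly different R shifts the result slightly):

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2, radius=3958.8):
    """Great-circle distance on a sphere (default radius: mean Earth radius, miles)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius * math.asin(math.sqrt(a))

print(haversine_miles(63.75639, -68.55556, 59.29667, -69.59944))  # ≈ 310.0 miles
```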

How long does it take to fly from Iqaluit to Aupaluk?

The estimated flight time from Iqaluit Airport to Aupaluk Airport is 1 hour and 5 minutes.
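
The site does not publish its timing formula. A common rule of thumb, shown below purely as an assumption, adds a fixed taxi/climb/descent allowance of about 30 minutes to cruise time at roughly 500 mph, which lands close to the quoted estimate.

```python
def estimated_flight_minutes(miles, cruise_mph=500, overhead_min=30):
    # Assumed rule of thumb, not the site's disclosed formula:
    # fixed taxi/climb/descent overhead plus cruise time at a typical speed.
    return overhead_min + 60 * miles / cruise_mph

print(round(estimated_flight_minutes(311)))  # ≈ 67 min, near the quoted 1 h 5 min
```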

What is the time difference between Iqaluit and Aupaluk?

There is no time difference between Iqaluit and Aupaluk.

Flight carbon footprint between Iqaluit Airport (YFB) and Aupaluk Airport (YPJ)

On average, flying from Iqaluit to Aupaluk generates about 71 kg of CO2 per passenger, equivalent to roughly 156 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
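
The pound figure follows from the exact kilogram-to-pound factor; a one-line check (the page's rounding convention is our guess):

```python
KG_PER_LB = 0.45359237            # exact definition of the avoirdupois pound
print(71 / KG_PER_LB)             # ≈ 156.5 lb, shown as 156 on this page
```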

Map of flight path and driving directions from Iqaluit to Aupaluk

See the map of the shortest flight path between Iqaluit Airport (YFB) and Aupaluk Airport (YPJ).

Airport information

Origin: Iqaluit Airport
City: Iqaluit
Country: Canada
IATA Code: YFB
ICAO Code: CYFB
Coordinates: 63°45′23″N, 68°33′20″W
Destination: Aupaluk Airport
City: Aupaluk
Country: Canada
IATA Code: YPJ
ICAO Code: CYLA
Coordinates: 59°17′48″N, 69°35′58″W
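
The coordinates above are given in degrees/minutes/seconds, while the distance formulas earlier expect decimal degrees. A small conversion helper (a sketch; the function name is ours):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    # South and west coordinates are negative in the decimal-degree convention
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

yfb = (dms_to_decimal(63, 45, 23, "N"), dms_to_decimal(68, 33, 20, "W"))
ypj = (dms_to_decimal(59, 17, 48, "N"), dms_to_decimal(69, 35, 58, "W"))
print(yfb)  # ≈ (63.7564, -68.5556)
print(ypj)  # ≈ (59.2967, -69.5994)
```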