
How far is Iqaluit from Aupaluk?

The distance between Aupaluk (Aupaluk Airport) and Iqaluit (Iqaluit Airport) is 311 miles / 500 kilometers / 270 nautical miles.

The driving distance from Aupaluk (YPJ) to Iqaluit (YFB) is 2616 miles / 4210 kilometers, and travel time by car is about 98 hours 15 minutes.


Distance from Aupaluk to Iqaluit

There are several ways to calculate the distance from Aupaluk to Iqaluit. Here are two standard methods:

Vincenty's formula (applied above)
  • 310.720 miles
  • 500.055 kilometers
  • 270.008 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
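For readers who want to reproduce the figure above, here is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid. The function name and the decimal-degree conversion of the airport coordinates are our own for illustration; this is not this site's actual code.

```python
import math

# WGS-84 ellipsoid constants
A_AXIS = 6378137.0            # semi-major axis, metres
F = 1 / 298.257223563         # flattening
B_AXIS = (1 - F) * A_AXIS     # semi-minor axis, metres

def vincenty_km(lat1, lon1, lat2, lon2):
    """Inverse Vincenty distance in kilometres (iterative)."""
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - F) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - F) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):                      # iterate to convergence
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.hypot(cosU2 * sinLam,
                              cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        if sinSigma == 0.0:
            return 0.0                        # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1.0 - sinAlpha ** 2
        cos2SigmaM = (cosSigma - 2.0 * sinU1 * sinU2 / cos2Alpha
                      if cos2Alpha else 0.0)  # points on the equator
        C = F / 16.0 * cos2Alpha * (4.0 + F * (4.0 - 3.0 * cos2Alpha))
        lamPrev = lam
        lam = L + (1.0 - C) * F * sinAlpha * (
            sigma + C * sinSigma * (
                cos2SigmaM + C * cosSigma * (-1.0 + 2.0 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < 1e-12:
            break

    u2 = cos2Alpha * (A_AXIS ** 2 - B_AXIS ** 2) / B_AXIS ** 2
    A = 1.0 + u2 / 16384.0 * (4096.0 + u2 * (-768.0 + u2 * (320.0 - 175.0 * u2)))
    B = u2 / 1024.0 * (256.0 + u2 * (-128.0 + u2 * (74.0 - 47.0 * u2)))
    dSigma = B * sinSigma * (cos2SigmaM + B / 4.0 * (
        cosSigma * (-1.0 + 2.0 * cos2SigmaM ** 2)
        - B / 6.0 * cos2SigmaM * (-3.0 + 4.0 * sinSigma ** 2)
        * (-3.0 + 4.0 * cos2SigmaM ** 2)))
    return B_AXIS * A * (sigma - dSigma) / 1000.0

# YPJ (59°17′48″N, 69°35′58″W) to YFB (63°45′23″N, 68°33′20″W)
km = vincenty_km(59.296667, -69.599444, 63.756389, -68.555556)
print(f"{km:.3f} km / {km / 1.609344:.3f} miles")  # ≈ 500.055 km / 310.720 mi
```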

Haversine formula
  • 310.038 miles
  • 498.958 kilometers
  • 269.416 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance, the shortest path between two points along the sphere's surface).
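The haversine calculation is much shorter. A sketch using the same coordinates as above and the commonly quoted mean Earth radius of 6,371 km (the exact radius this site uses is an assumption):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given mean radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h))

print(f"{haversine_km(59.296667, -69.599444, 63.756389, -68.555556):.1f} km")
# ≈ 499.0 km, matching the ~498.958 km figure above to within the radius chosen
```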

How long does it take to fly from Aupaluk to Iqaluit?

The estimated flight time from Aupaluk Airport to Iqaluit Airport is 1 hour and 5 minutes.
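This figure is consistent with a simple distance-over-speed estimate. As a rough sketch, where the cruise speed and the fixed taxi/climb allowance are assumptions rather than this site's published formula:

```python
distance_miles = 311   # great-circle distance from above
cruise_mph = 500       # assumed average cruise speed
overhead_min = 30      # assumed allowance for taxi, climb and descent

total_min = distance_miles / cruise_mph * 60 + overhead_min
print(f"{int(total_min // 60)} h {int(total_min % 60)} min")  # 1 h 7 min
```

With these assumptions the estimate lands within a couple of minutes of the 1 hour 5 minutes quoted above.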

What is the time difference between Aupaluk and Iqaluit?

There is no time difference between Aupaluk and Iqaluit.

Flight carbon footprint between Aupaluk Airport (YPJ) and Iqaluit Airport (YFB)

On average, flying from Aupaluk to Iqaluit generates about 71 kg (156 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
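The unit conversion is straightforward; the 71 kg per-passenger figure itself comes from this site's own model:

```python
co2_kg = 71                  # per-passenger estimate quoted above
KG_PER_LB = 0.45359237       # exact kilogram-per-pound definition
print(f"{co2_kg / KG_PER_LB:.1f} lbs")  # 156.5 lbs, rounded to 156 above
```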

Map of flight path and driving directions from Aupaluk to Iqaluit

See the map of the shortest flight path between Aupaluk Airport (YPJ) and Iqaluit Airport (YFB).

Airport information

Origin: Aupaluk Airport
City: Aupaluk
Country: Canada
IATA Code: YPJ
ICAO Code: CYLA
Coordinates: 59°17′48″N, 69°35′58″W

Destination: Iqaluit Airport
City: Iqaluit
Country: Canada
IATA Code: YFB
ICAO Code: CYFB
Coordinates: 63°45′23″N, 68°33′20″W