How far is Aupaluk from Wollaston Lake?

The distance between Wollaston Lake (Wollaston Lake Airport) and Aupaluk (Aupaluk Airport) is 1199 miles / 1930 kilometers / 1042 nautical miles.

The driving distance from Wollaston Lake (ZWL) to Aupaluk (YPJ) is 2851 miles / 4588 kilometers, and travel time by car is about 80 hours 22 minutes.

Wollaston Lake Airport – Aupaluk Airport

1199 miles / 1930 kilometers / 1042 nautical miles


Distance from Wollaston Lake to Aupaluk

There are several ways to calculate the distance from Wollaston Lake to Aupaluk. Here are two standard methods:

Vincenty's formula (applied above)
  • 1199.259 miles
  • 1930.021 kilometers
  • 1042.128 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
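As an illustration, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid (semi-major axis 6,378,137 m, flattening 1/298.257223563). The function name, iteration cap and convergence tolerance are choices made for this example rather than the calculator's own code; with the airport coordinates listed further down it returns approximately the 1199-mile figure quoted above.

```python
from math import atan, atan2, cos, radians, sin, sqrt, tan

def vincenty_miles(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns statute miles."""
    a = 6378137.0                # WGS-84 semi-major axis (metres)
    f = 1 / 298.257223563        # WGS-84 flattening
    b = (1 - f) * a              # semi-minor axis

    L = radians(lon2 - lon1)
    U1 = atan((1 - f) * tan(radians(lat1)))   # reduced latitudes
    U2 = atan((1 - f) * tan(radians(lat2)))
    sinU1, cosU1 = sin(U1), cos(U1)
    sinU2, cosU2 = sin(U2), cos(U2)

    lam = L
    for _ in range(200):         # iterate until the longitude difference converges
        sin_lam, cos_lam = sin(lam), cos(lam)
        sin_sigma = sqrt((cosU2 * sin_lam) ** 2 +
                         (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        if sin_sigma == 0:
            return 0.0           # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m +
                                     C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u_sq = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
    B = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
        B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sigma_m ** 2)))

    metres = b * A * (sigma - delta_sigma)
    return metres / 1609.344     # metres -> statute miles

# ZWL (58°6′24″N, 103°10′19″W) to YPJ (59°17′48″N, 69°35′58″W)
print(round(vincenty_miles(58.1067, -103.1719, 59.2967, -69.5994), 1))  # ~1199 miles
```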

Haversine formula
  • 1194.999 miles
  • 1923.164 kilometers
  • 1038.425 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
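For comparison, a haversine sketch assuming a mean Earth radius of 6371 km (the calculator's exact radius constant is not stated, so treat it as an assumption):

```python
from math import asin, cos, radians, sin, sqrt

def haversine_miles(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given mean radius, in statute miles."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
    km = 2 * radius_km * asin(sqrt(a))
    return km / 1.609344         # kilometres -> statute miles

print(round(haversine_miles(58.1067, -103.1719, 59.2967, -69.5994), 1))  # ~1195 miles
```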

How long does it take to fly from Wollaston Lake to Aupaluk?

The estimated flight time from Wollaston Lake Airport to Aupaluk Airport is 2 hours and 46 minutes.
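The assumptions behind this estimate are not published. A common back-of-envelope approach is cruise distance divided by an assumed block speed, plus a fixed allowance for taxi, climb and descent; the 500 mph and 30-minute constants below are illustrative assumptions and give a result close to, though not exactly, the figure above.

```python
def estimated_flight_time(distance_miles, cruise_mph=500.0, overhead_min=30.0):
    """Rough block-time estimate: cruise time plus a fixed allowance.

    The 500 mph / 30 min constants are illustrative assumptions,
    not the calculator's own parameters.
    """
    total_min = distance_miles / cruise_mph * 60 + overhead_min
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} hours {minutes} minutes"

print(estimated_flight_time(1199.259))  # "2 hours 54 minutes" with these assumptions
```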

Flight carbon footprint between Wollaston Lake Airport (ZWL) and Aupaluk Airport (YPJ)

On average, flying from Wollaston Lake to Aupaluk generates about 161 kg of CO2 per passenger, which is roughly 355 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
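As a quick check on the unit conversion (using 2.20462 lb per kg):

```python
co2_kg = 161                    # estimated CO2 per passenger for this route, from above
co2_lbs = co2_kg * 2.20462      # kilograms to pounds
print(round(co2_lbs))           # ~355 lbs
```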

Map of flight path and driving directions from Wollaston Lake to Aupaluk

See the map of the shortest flight path between Wollaston Lake Airport (ZWL) and Aupaluk Airport (YPJ).

Airport information

Origin Wollaston Lake Airport
City: Wollaston Lake
Country: Canada
IATA Code: ZWL
ICAO Code: CZWL
Coordinates: 58°6′24″N, 103°10′19″W
Destination Aupaluk Airport
City: Aupaluk
Country: Canada
IATA Code: YPJ
ICAO Code: CYLA
Coordinates: 59°17′48″N, 69°35′58″W
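To feed the coordinates above into either distance formula, they first have to be converted from degrees/minutes/seconds to decimal degrees. A small sketch (the helper name is arbitrary):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to signed decimal degrees."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

# ZWL: 58°6′24″N, 103°10′19″W    YPJ: 59°17′48″N, 69°35′58″W
zwl = (dms_to_decimal(58, 6, 24, "N"), dms_to_decimal(103, 10, 19, "W"))
ypj = (dms_to_decimal(59, 17, 48, "N"), dms_to_decimal(69, 35, 58, "W"))
print(zwl)  # approx (58.1067, -103.1719)
print(ypj)  # approx (59.2967, -69.5994)
```

These decimal values are the ones passed to the distance sketches earlier on this page.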