
How far is Texada from Aupaluk?

The distance between Aupaluk (Aupaluk Airport) and Texada (Texada/Gillies Bay Airport) is 2237 miles / 3600 kilometers / 1944 nautical miles.

The driving distance from Aupaluk (YPJ) to Texada (YGB) is 3592 miles / 5780 kilometers, and travel time by car is about 88 hours 31 minutes.

Aupaluk Airport – Texada/Gillies Bay Airport

2237 miles / 3600 kilometers / 1944 nautical miles


Distance from Aupaluk to Texada

There are several ways to calculate the distance from Aupaluk to Texada. Here are two standard methods:

Vincenty's formula (applied above)
  • 2236.690 miles
  • 3599.604 kilometers
  • 1943.631 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
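As a sketch of how such a figure can be computed, here is Vincenty's inverse formula on the WGS-84 ellipsoid, applied to the coordinates listed for the two airports below. The function name and the iteration tolerance are illustrative choices, not anything specific to this site's implementation.

```python
import math

def vincenty_inverse(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Geodesic distance in kilometers on the WGS-84 ellipsoid,
    via Vincenty's inverse formula."""
    a = 6378137.0               # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563       # WGS-84 flattening
    b = (1 - f) * a             # semi-minor axis (m)

    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(phi1))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0          # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # cos(2*sigma_m); cos2_alpha == 0 only for equatorial lines
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha != 0 else 0.0)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
                                   * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0  # metres -> km

# YPJ (59°17′48″N, 69°35′58″W) to YGB (49°41′39″N, 124°31′4″W)
km = vincenty_inverse(59 + 17/60 + 48/3600, -(69 + 35/60 + 58/3600),
                      49 + 41/60 + 39/3600, -(124 + 31/60 + 4/3600))
```

With the airport coordinates from this page, the result lands at roughly 3599.6 km, matching the figure quoted above.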

Haversine formula
  • 2229.708 miles
  • 3588.368 kilometers
  • 1937.564 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
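The haversine calculation is short enough to show in full. This sketch assumes a mean Earth radius of 6371 km (a common convention; other radii give slightly different results):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points,
    assuming a spherical Earth of the given radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    # haversine of the central angle
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# YPJ (59°17′48″N, 69°35′58″W) to YGB (49°41′39″N, 124°31′4″W)
km = haversine_km(59 + 17/60 + 48/3600, -(69 + 35/60 + 58/3600),
                  49 + 41/60 + 39/3600, -(124 + 31/60 + 4/3600))
```

With the coordinates below, this returns roughly 3588.4 km, in line with the haversine figure quoted above and slightly shorter than the ellipsoidal (Vincenty) result.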

How long does it take to fly from Aupaluk to Texada?

The estimated flight time from Aupaluk Airport to Texada/Gillies Bay Airport is 4 hours and 44 minutes.
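Estimates like this typically combine a cruise-speed assumption with a fixed allowance for taxi, climb, and descent. The sketch below uses an assumed 500 mph cruise and a 30-minute overhead; both constants are illustrative, which is why its output differs somewhat from the site's own estimate.

```python
def flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough block-time estimate: cruise time plus a fixed overhead
    for taxi, climb, and descent. Both constants are assumptions."""
    total_min = overhead_min + distance_miles / cruise_mph * 60
    return int(total_min // 60), round(total_min % 60)

h, m = flight_time(2237)   # (4, 58) under these assumed constants
```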

Flight carbon footprint between Aupaluk Airport (YPJ) and Texada/Gillies Bay Airport (YGB)

On average, flying from Aupaluk to Texada generates about 245 kg (539 lb) of CO2 per passenger. This figure is an estimate and includes only the CO2 generated by burning jet fuel.
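A per-passenger figure like this is usually distance multiplied by an emission-intensity factor. The factor below (0.068 kg CO2 per passenger-kilometer) is an assumed illustrative value chosen so the arithmetic is visible; real intensities vary by aircraft type, seating density, and load factor.

```python
def flight_co2_kg(distance_km, kg_per_pax_km=0.068):
    """Per-passenger CO2 from jet-fuel burn. The intensity factor is
    an assumed illustrative average, not a measured value."""
    return distance_km * kg_per_pax_km

co2 = flight_co2_kg(3600)   # about 245 kg with this assumed factor
```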

Map of flight path and driving directions from Aupaluk to Texada

See the map of the shortest flight path between Aupaluk Airport (YPJ) and Texada/Gillies Bay Airport (YGB).

Airport information

Origin Aupaluk Airport
City: Aupaluk
Country: Canada
IATA Code: YPJ
ICAO Code: CYLA
Coordinates: 59°17′48″N, 69°35′58″W
Destination Texada/Gillies Bay Airport
City: Texada
Country: Canada
IATA Code: YGB
ICAO Code: CYGB
Coordinates: 49°41′39″N, 124°31′4″W