
How far is Prince George from Quaqtaq?

The distance between Quaqtaq (Quaqtaq Airport) and Prince George (Prince George Airport) is 1980 miles / 3186 kilometers / 1720 nautical miles.

The driving distance from Quaqtaq (YQC) to Prince George (YXS) is 3189 miles / 5133 kilometers, and travel time by car is about 82 hours 17 minutes.

Quaqtaq Airport – Prince George Airport

1980 miles / 3186 kilometers / 1720 nautical miles


Distance from Quaqtaq to Prince George

There are several ways to calculate the distance from Quaqtaq to Prince George. Here are two standard methods:

Vincenty's formula (applied above)
  • 1979.602 miles
  • 3185.860 kilometers
  • 1720.227 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
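As a sketch of how an ellipsoidal distance like the one above can be computed, here is a minimal Python implementation of Vincenty's inverse formula on the WGS-84 ellipsoid. The decimal coordinates in the usage line are converted from the DMS values in the Airport information section below; this is an illustration, not the calculator's own source code.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns statute miles."""
    a = 6378137.0              # WGS-84 semi-major axis (meters)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a            # semi-minor axis

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):  # iterate until the longitude difference converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)  # guard for equatorial lines
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m
              * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1609.344  # meters to statute miles

# YQC to YXS in decimal degrees
print(round(vincenty_miles(61.0464, -69.6178, 53.8892, -122.6789), 1))  # ≈ 1979.6
```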

Haversine formula
  • 1972.938 miles
  • 3175.136 kilometers
  • 1714.436 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
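For comparison, a compact Python version of the haversine calculation. The 6371 km mean Earth radius is a common convention and an assumption here; the site does not state which radius it uses.

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance on a sphere of mean radius 6371 km, in statute miles."""
    R_km = 6371.0
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * R_km * math.asin(math.sqrt(a)) / 1.609344  # km to statute miles

# YQC to YXS in decimal degrees
print(round(haversine_miles(61.0464, -69.6178, 53.8892, -122.6789), 1))  # ≈ 1973
```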

How long does it take to fly from Quaqtaq to Prince George?

The estimated flight time from Quaqtaq Airport to Prince George Airport is 4 hours and 14 minutes.
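The page does not publish its timing formula, but estimates like this are commonly built from the distance, an assumed average speed, and a fixed allowance for takeoff and landing. A minimal sketch, where the 850 km/h cruise speed and 30-minute overhead are illustrative assumptions rather than the calculator's documented parameters:

```python
def flight_time(distance_km, cruise_kmh=850, overhead_min=30):
    """Estimated block time: cruise segment plus a fixed taxi/climb/descent allowance."""
    total_min = overhead_min + 60 * distance_km / cruise_kmh
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} h {minutes} min"

print(flight_time(3185.86))  # "4 h 15 min", within a minute of the figure above
```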

Flight carbon footprint between Quaqtaq Airport (YQC) and Prince George Airport (YXS)

On average, flying from Quaqtaq to Prince George generates about 216 kg (476 lb) of CO2 per passenger. This figure is an estimate and includes only the CO2 generated by burning jet fuel.
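As a quick sanity check on the unit conversion, and to see the per-mile intensity implied by the page's own numbers:

```python
KG_PER_LB = 0.45359237  # exact definition of the pound

co2_kg = 216
print(round(co2_kg / KG_PER_LB))    # 476 lb, matching the figure above
print(round(co2_kg / 1979.602, 3))  # ≈ 0.109 kg CO2 per passenger-mile implied
```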

Map of flight path and driving directions from Quaqtaq to Prince George

See the map of the shortest flight path between Quaqtaq Airport (YQC) and Prince George Airport (YXS).

Airport information

Origin: Quaqtaq Airport
City: Quaqtaq
Country: Canada
IATA Code: YQC
ICAO Code: CYHA
Coordinates: 61°2′47″N, 69°37′4″W

Destination: Prince George Airport
City: Prince George
Country: Canada
IATA Code: YXS
ICAO Code: CYXS
Coordinates: 53°53′21″N, 122°40′44″W
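
The coordinates above are given in degrees, minutes, and seconds, while the distance sketches earlier use decimal degrees. A small conversion helper (hemisphere letters S and W produce negative values):

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to signed decimal degrees."""
    sign = -1 if hemisphere in "SW" else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

print(round(dms_to_decimal(61, 2, 47, "N"), 4))    # 61.0464   (YQC latitude)
print(round(dms_to_decimal(122, 40, 44, "W"), 4))  # -122.6789 (YXS longitude)
```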