
How far is Iqaluit from Prince George?

The distance between Prince George (Prince George Airport) and Iqaluit (Iqaluit Airport) is 1988 miles / 3200 kilometers / 1728 nautical miles.

The driving distance from Prince George (YXS) to Iqaluit (YFB) is 4343 miles / 6989 kilometers, and travel time by car is about 116 hours 57 minutes.

Prince George Airport – Iqaluit Airport

1988 miles / 3200 kilometers / 1728 nautical miles


Distance from Prince George to Iqaluit

There are several ways to calculate the distance from Prince George to Iqaluit. Here are two standard methods:

Vincenty's formula (applied above)
  • 1988.099 miles
  • 3199.535 kilometers
  • 1727.610 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
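
As a cross-check, the same ellipsoidal distance can be computed with the pyproj library. This is a minimal sketch, not the site's own code: pyproj implements Karney's geodesic algorithm rather than Vincenty's, so the two agree only to within a very small margin. The decimal coordinates are converted from the DMS values in the airport information section below.

    # Ellipsoidal (WGS84) distance between YXS and YFB.
    # Sketch only: pyproj uses Karney's geodesic algorithm, which, like
    # Vincenty's formula, models the earth as an ellipsoid.
    from pyproj import Geod

    yxs_lat, yxs_lon = 53.889167, -122.678889  # 53°53′21″N, 122°40′44″W
    yfb_lat, yfb_lon = 63.756389, -68.555556   # 63°45′23″N, 68°33′20″W

    geod = Geod(ellps="WGS84")
    _, _, meters = geod.inv(yxs_lon, yxs_lat, yfb_lon, yfb_lat)

    print(f"{meters / 1609.344:.3f} miles")       # ~1988.1
    print(f"{meters / 1000:.3f} kilometers")      # ~3199.5
    print(f"{meters / 1852:.3f} nautical miles")  # ~1727.6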

Haversine formula
  • 1981.461 miles
  • 3188.852 kilometers
  • 1721.842 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth, yielding the great-circle distance: the shortest path between two points along the sphere's surface.
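
For comparison, the haversine figure can be reproduced in a few lines of Python. This is a minimal sketch assuming a mean earth radius of 6371 km; the site does not state which radius it uses, but this value matches the figures above to within rounding.

    # Haversine (great-circle) distance between YXS and YFB.
    from math import asin, cos, radians, sin, sqrt

    EARTH_RADIUS_KM = 6371.0  # assumed mean earth radius

    def haversine_km(lat1, lon1, lat2, lon2):
        """Great-circle distance in kilometers between two points."""
        phi1, phi2 = radians(lat1), radians(lat2)
        dphi = radians(lat2 - lat1)
        dlmb = radians(lon2 - lon1)
        a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
        return 2 * EARTH_RADIUS_KM * asin(sqrt(a))

    km = haversine_km(53.889167, -122.678889, 63.756389, -68.555556)
    print(f"{km / 1.609344:.3f} miles")        # ~1981.5
    print(f"{km:.3f} kilometers")              # ~3188.9
    print(f"{km / 1.852:.3f} nautical miles")  # ~1721.8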

How long does it take to fly from Prince George to Iqaluit?

The estimated flight time from Prince George Airport to Iqaluit Airport is 4 hours and 15 minutes.
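
The site does not publish its flight-time model. As a rough sketch, dividing the distance by an assumed average block speed (the ~470 mph below is an illustrative assumption, not a figure from the source) lands close to the quoted estimate.

    # Rough flight-time estimate.
    # avg_block_speed_mph is an assumption for illustration; the site's
    # actual model is not published.
    distance_miles = 1988
    avg_block_speed_mph = 470  # assumed, averaged over climb/cruise/descent

    hours = distance_miles / avg_block_speed_mph
    h, m = int(hours), round(hours % 1 * 60)
    print(f"about {h} h {m} min")  # ~4 h 14 min under these assumptions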

Flight carbon footprint between Prince George Airport (YXS) and Iqaluit Airport (YFB)

On average, flying from Prince George to Iqaluit generates about 217 kg of CO2 per passenger, which is about 478 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
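
A back-of-the-envelope version of that estimate multiplies distance by a per-passenger emission factor. The factor below (~0.109 kg CO2 per passenger mile) is back-calculated from the figures above for illustration; it is not a constant published by the source.

    # Back-of-the-envelope CO2 estimate.
    # The emission factor is back-calculated from 217 kg over 1988 miles;
    # it is illustrative, not a published constant.
    KG_PER_LB = 0.45359237

    distance_miles = 1988
    kg_co2_per_mile = 217 / 1988  # ~0.109 kg per passenger mile

    kg = distance_miles * kg_co2_per_mile
    print(f"{kg:.0f} kg CO2 per passenger")  # 217 kg
    print(f"{kg / KG_PER_LB:.0f} lbs")       # ~478 lbs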

Map of flight path and driving directions from Prince George to Iqaluit

See the map of the shortest flight path between Prince George Airport (YXS) and Iqaluit Airport (YFB).

Airport information

Origin: Prince George Airport
City: Prince George
Country: Canada
IATA Code: YXS
ICAO Code: CYXS
Coordinates: 53°53′21″N, 122°40′44″W

Destination: Iqaluit Airport
City: Iqaluit
Country: Canada
IATA Code: YFB
ICAO Code: CYFB
Coordinates: 63°45′23″N, 68°33′20″W
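
The coordinates above are in degrees, minutes, and seconds, while the distance formulas earlier need decimal degrees. A minimal conversion sketch (the function name is illustrative):

    # Convert degrees/minutes/seconds to decimal degrees.
    # Southern and western hemispheres become negative values.
    def dms_to_decimal(degrees, minutes, seconds, hemisphere):
        sign = -1 if hemisphere in ("S", "W") else 1
        return sign * (degrees + minutes / 60 + seconds / 3600)

    print(f"{dms_to_decimal(53, 53, 21, 'N'):.6f}")   # YXS latitude:    53.889167
    print(f"{dms_to_decimal(122, 40, 44, 'W'):.6f}")  # YXS longitude: -122.678889
    print(f"{dms_to_decimal(63, 45, 23, 'N'):.6f}")   # YFB latitude:    63.756389
    print(f"{dms_to_decimal(68, 33, 20, 'W'):.6f}")   # YFB longitude:  -68.555556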