
How far is Thunder Bay from Prince George?

The distance between Prince George (Prince George Airport) and Thunder Bay (Thunder Bay International Airport) is 1485 miles / 2390 kilometers / 1290 nautical miles.

The driving distance from Prince George (YXS) to Thunder Bay (YQT) is 1738 miles / 2797 kilometers, and travel time by car is about 35 hours 41 minutes.

Prince George Airport – Thunder Bay International Airport
1485 miles / 2390 kilometers / 1290 nautical miles

Distance from Prince George to Thunder Bay

There are several ways to calculate the distance from Prince George to Thunder Bay. Here are two standard methods:

Vincenty's formula (applied above)
  • 1484.925 miles
  • 2389.754 kilometers
  • 1290.364 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
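For readers who want to reproduce the ellipsoidal figure, below is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid, using the airport coordinates listed in the airport information section further down. It illustrates the method; it is not the calculator's own code.

import math

def vincenty_km(lat1, lon1, lat2, lon2):
    # WGS-84 ellipsoid parameters
    a = 6378137.0                    # semi-major axis in metres
    f = 1 / 298.257223563            # flattening
    b = (1 - f) * a                  # semi-minor axis

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):             # iterate until the longitude difference converges
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.hypot(cosU2 * sinLam, cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        if sinSigma == 0:
            return 0.0               # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = cosSigma - 2 * sinU1 * sinU2 / cos2Alpha if cos2Alpha else 0.0
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (sigma + C * sinSigma *
              (cos2SigmaM + C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < 1e-12:
            break

    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    deltaSigma = B * sinSigma * (cos2SigmaM + B / 4 * (cosSigma * (-1 + 2 * cos2SigmaM ** 2) -
                 B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2) * (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - deltaSigma) / 1000   # distance in kilometres

# YXS (53°53′21″N, 122°40′44″W) to YQT (48°22′18″N, 89°19′26″W)
print(round(vincenty_km(53.8892, -122.6789, 48.3717, -89.3239), 1))   # ≈ 2390 km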

Haversine formula
  • 1480.518 miles
  • 2382.662 kilometers
  • 1286.535 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
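The same pair of coordinates can be checked against the spherical model. The sketch below assumes a mean Earth radius of 6371 km, which is why it lands slightly below the ellipsoidal result.

import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    # Great-circle distance between two points on a spherical Earth, in kilometres.
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

# YXS (53°53′21″N, 122°40′44″W) to YQT (48°22′18″N, 89°19′26″W)
print(round(haversine_km(53.8892, -122.6789, 48.3717, -89.3239), 1))   # ≈ 2383 km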

How long does it take to fly from Prince George to Thunder Bay?

The estimated flight time from Prince George Airport to Thunder Bay International Airport is 3 hours and 18 minutes.
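As a rough sanity check, a similar figure can be derived from the air distance alone. The average speed and the takeoff/landing allowance below are assumptions chosen for illustration, not the calculator's published parameters, so the result lands within about ten minutes of the quoted estimate rather than matching it exactly.

# Rough flight-time estimate from the air distance.
distance_miles = 1484.925
avg_speed_mph = 500          # assumed average block speed (assumption)
overhead_min = 30            # assumed allowance for taxi, takeoff and landing (assumption)

total_min = distance_miles / avg_speed_mph * 60 + overhead_min
print(f"{int(total_min // 60)} h {int(total_min % 60)} min")   # 3 h 28 min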

Flight carbon footprint between Prince George Airport (YXS) and Thunder Bay International Airport (YQT)

On average, flying from Prince George to Thunder Bay generates about 179 kg of CO2 per passenger, which is roughly 394 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
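The pound figure is simply a unit conversion of the kilogram estimate; a one-line check:

KG_TO_LB = 2.20462           # kilograms to pounds
co2_kg = 179
print(f"{co2_kg * KG_TO_LB:.1f} lb")   # 394.6 lb, quoted above as 394 lbs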

Map of flight path and driving directions from Prince George to Thunder Bay

See the map of the shortest flight path between Prince George Airport (YXS) and Thunder Bay International Airport (YQT).

Airport information

Origin: Prince George Airport
City: Prince George
Country: Canada
IATA Code: YXS
ICAO Code: CYXS
Coordinates: 53°53′21″N, 122°40′44″W
Destination: Thunder Bay International Airport
City: Thunder Bay
Country: Canada
IATA Code: YQT
ICAO Code: CYQT
Coordinates: 48°22′18″N, 89°19′26″W