
How far is Thunder Bay from West Palm Beach, FL?

The distance between West Palm Beach (Palm Beach International Airport) and Thunder Bay (Thunder Bay International Airport) is 1577 miles / 2537 kilometers / 1370 nautical miles.

The driving distance from West Palm Beach (PBI) to Thunder Bay (YQT) is 1980 miles / 3187 kilometers, and travel time by car is about 37 hours 5 minutes.

Palm Beach International Airport – Thunder Bay International Airport

1577 miles / 2537 kilometers / 1370 nautical miles


Distance from West Palm Beach to Thunder Bay

There are several ways to calculate the distance from West Palm Beach to Thunder Bay. Here are two standard methods:

Vincenty's formula (applied above)
  • 1576.557 miles
  • 2537.222 kilometers
  • 1369.990 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
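For reference, the inverse Vincenty solution fits in a few dozen lines. Below is a minimal Python sketch (the function name and structure are our own, not the calculator's actual implementation) that iterates on the WGS-84 ellipsoid; the airport coordinates from the table at the bottom of the page are converted from degrees/minutes/seconds to decimal degrees inline, and the result should match the ~2537 km figure above to within rounding.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Inverse Vincenty solution on the WGS-84 ellipsoid; returns meters."""
    a = 6378137.0               # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563       # WGS-84 flattening
    b = (1 - f) * a             # semi-minor axis

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0.0:
            return 0.0          # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)  # 0 for equatorial lines
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m + C * cos_sigma
                                     * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
        - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
        * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# Airport coordinates from the table below, converted from DMS to decimal degrees
pbi = (26 + 40/60 + 59/3600, -(80 + 5/60 + 44/3600))   # 26°40′59″N, 80°5′44″W
yqt = (48 + 22/60 + 18/3600, -(89 + 19/60 + 26/3600))  # 48°22′18″N, 89°19′26″W
meters = vincenty_distance(pbi[0], pbi[1], yqt[0], yqt[1])
print(f"{meters / 1000:.3f} km")  # should come out very close to 2537.222 km
```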

Haversine formula
  • 1578.798 miles
  • 2540.829 kilometers
  • 1371.938 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth, giving the great-circle distance (the shortest path between the two points along the sphere's surface).
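Because the haversine formula is so compact, it is easy to check the figures directly. A minimal Python sketch (the function name is illustrative) using the conventional 6371 km mean Earth radius:

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points on a sphere."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h))

# PBI -> YQT with the coordinates listed at the bottom of the page
print(f"{haversine_distance(26.6831, -80.0956, 48.3717, -89.3239):.1f} km")
# prints a value very close to the 2540.8 km quoted above
```

Note the small spread between the two methods (about 3.6 km here): the spherical model trades a little accuracy for a much simpler closed-form calculation.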

How long does it take to fly from West Palm Beach to Thunder Bay?

The estimated flight time from Palm Beach International Airport to Thunder Bay International Airport is 3 hours and 29 minutes.
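The assumptions behind this estimate are not published on the page, but flight-time estimates of this kind are usually just distance divided by an average cruise speed plus a fixed allowance for taxi, climb, and descent. A sketch with illustrative parameters (the 500 mph cruise speed and 30-minute overhead are assumptions, not the calculator's values, so the result differs slightly from the 3 hours 29 minutes quoted above):

```python
def estimate_flight_minutes(distance_miles, cruise_mph=500.0, overhead_min=30.0):
    """Rough airtime: cruise time plus a fixed taxi/climb/descent overhead."""
    return distance_miles / cruise_mph * 60 + overhead_min

minutes = estimate_flight_minutes(1577)
print(f"{int(minutes // 60)} h {int(minutes % 60)} min")  # 3 h 39 min with these assumptions
```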

What is the time difference between West Palm Beach and Thunder Bay?

There is no time difference between West Palm Beach and Thunder Bay.

Flight carbon footprint between Palm Beach International Airport (PBI) and Thunder Bay International Airport (YQT)

On average, flying from West Palm Beach to Thunder Bay generates about 184 kg of CO2 per passenger, which is equivalent to 406 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
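As a back-of-the-envelope check only: the per-passenger factor implied by the page's own numbers is roughly 0.12 kg of CO2 per mile, and the pounds figure is a plain unit conversion. Real footprint models account for aircraft type, load factor, and the disproportionate fuel burn of takeoff, so the sketch below merely reconstructs the arithmetic:

```python
KG_PER_MILE = 184 / 1577      # factor back-derived from this page's figures
KG_TO_LBS = 2.20462           # kilograms to pounds

co2_kg = KG_PER_MILE * 1577   # per-passenger estimate for PBI -> YQT
print(f"{co2_kg:.0f} kg ~= {co2_kg * KG_TO_LBS:.0f} lbs")  # 184 kg ~= 406 lbs
```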

Map of flight path and driving directions from West Palm Beach to Thunder Bay

See the map of the shortest flight path between Palm Beach International Airport (PBI) and Thunder Bay International Airport (YQT).

Airport information

Origin Palm Beach International Airport
City: West Palm Beach, FL
Country: United States
IATA Code: PBI
ICAO Code: KPBI
Coordinates: 26°40′59″N, 80°5′44″W
Destination Thunder Bay International Airport
City: Thunder Bay
Country: Canada
IATA Code: YQT
ICAO Code: CYQT
Coordinates: 48°22′18″N, 89°19′26″W