
How far is Terrace from West Palm Beach, FL?

The distance between West Palm Beach (Palm Beach International Airport) and Terrace (Northwest Regional Airport Terrace-Kitimat) is 3098 miles / 4986 kilometers / 2692 nautical miles.

The driving distance from West Palm Beach (PBI) to Terrace (YXT) is 3808 miles / 6128 kilometers, and travel time by car is about 72 hours 21 minutes.

Palm Beach International Airport – Northwest Regional Airport Terrace-Kitimat
3098 miles / 4986 kilometers / 2692 nautical miles


Distance from West Palm Beach to Terrace

There are several ways to calculate the distance from West Palm Beach to Terrace. Here are two standard methods:

Vincenty's formula (applied above)
  • 3098.145 miles
  • 4985.981 kilometers
  • 2692.214 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
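
Below is a minimal sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The airport coordinates are taken from the airport information section further down; the convergence tolerance and iteration cap are standard choices but are assumptions here, not the calculator's documented parameters.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Ellipsoidal distance in statute miles between two lat/lon points (degrees)."""
    a = 6378137.0              # WGS-84 semi-major axis (meters)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a            # semi-minor axis

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0         # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # cos2_alpha == 0 on equatorial lines; 2*sigma_m is then irrelevant
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
        B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) *
        (-3 + 4 * cos_2sigma_m ** 2)))
    meters = b * A * (sigma - delta_sigma)
    return meters / 1609.344   # meters -> statute miles

# PBI and YXT in decimal degrees (converted from the airport section below)
print(vincenty_miles(26.6831, -80.0956, 54.4683, -128.5758))  # ≈ 3098 miles
```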

Haversine formula
  • 3094.964 miles
  • 4980.861 kilometers
  • 2689.450 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
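
A minimal sketch of the haversine calculation follows. The mean Earth radius of 6371 km is an assumption (a common convention) that happens to reproduce the figures above closely.

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in miles, assuming a sphere of the given radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2 +
         math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    c = 2 * math.asin(math.sqrt(a))   # central angle in radians
    return radius_km * c / 1.609344   # km -> statute miles

print(haversine_miles(26.6831, -80.0956, 54.4683, -128.5758))  # ≈ 3095 miles
```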

How long does it take to fly from West Palm Beach to Terrace?

The estimated flight time from Palm Beach International Airport to Northwest Regional Airport Terrace-Kitimat is 6 hours and 21 minutes.
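
The page does not state the constants behind its estimate. A common rule of thumb is an average block speed of about 500 mph plus roughly 30 minutes for taxi, take-off, and landing; both constants in the sketch below are assumptions, so the result only approximates the 6 hours 21 minutes quoted above.

```python
def flight_time_hours(distance_miles, cruise_mph=500, overhead_hours=0.5):
    # Assumed constants: ~500 mph average speed plus a 30-minute
    # allowance for taxi, climb, and descent. Not the calculator's
    # exact parameters.
    return distance_miles / cruise_mph + overhead_hours

hours = flight_time_hours(3098)
print(f"{int(hours)} h {round(hours % 1 * 60)} min")  # ≈ 6 h 42 min with these assumptions
```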

Flight carbon footprint between Palm Beach International Airport (PBI) and Northwest Regional Airport Terrace-Kitimat (YXT)

On average, flying from West Palm Beach to Terrace generates about 346 kg of CO2 per passenger, which is equivalent to 763 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
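
The page does not publish its emissions methodology. Back-solving from the figures above gives roughly 346 kg ÷ 3098 mi ≈ 0.112 kg of CO2 per passenger-mile; the sketch below treats that factor as an assumption rather than a documented value.

```python
KG_PER_LB = 0.45359237

def co2_kg(distance_miles, kg_per_passenger_mile=0.112):
    # Emissions factor back-solved from the page's own figures
    # (346 kg over 3098 miles); treat it as an assumption.
    return distance_miles * kg_per_passenger_mile

kg = co2_kg(3098)
print(f"{kg:.0f} kg ≈ {kg / KG_PER_LB:.0f} lbs")  # ≈ 347 kg ≈ 765 lbs
```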

Map of flight path and driving directions from West Palm Beach to Terrace

See the map of the shortest flight path between Palm Beach International Airport (PBI) and Northwest Regional Airport Terrace-Kitimat (YXT).

Airport information

Origin: Palm Beach International Airport
City: West Palm Beach, FL
Country: United States
IATA Code: PBI
ICAO Code: KPBI
Coordinates: 26°40′59″N, 80°5′44″W

Destination: Northwest Regional Airport Terrace-Kitimat
City: Terrace
Country: Canada
IATA Code: YXT
ICAO Code: CYXT
Coordinates: 54°28′6″N, 128°34′33″W
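
The coordinates above are given in degrees/minutes/seconds, while the distance formulas earlier expect decimal degrees. A small conversion helper (hypothetical, not part of the page) ties the two together:

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus an N/S/E/W hemisphere
    letter to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

pbi = (dms_to_decimal(26, 40, 59, "N"), dms_to_decimal(80, 5, 44, "W"))
yxt = (dms_to_decimal(54, 28, 6, "N"), dms_to_decimal(128, 34, 33, "W"))
print(pbi, yxt)  # ≈ (26.6831, -80.0956) and (54.4683, -128.5758)
```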