
How far is Terrace from San Antonio, TX?

The distance between San Antonio (San Antonio International Airport) and Terrace (Northwest Regional Airport Terrace-Kitimat) is 2282 miles / 3672 kilometers / 1983 nautical miles.

The driving distance from San Antonio (SAT) to Terrace (YXT) is 2856 miles / 4597 kilometers, and travel time by car is about 55 hours 17 minutes.

San Antonio International Airport – Northwest Regional Airport Terrace-Kitimat

2282 miles / 3672 kilometers / 1983 nautical miles


Distance from San Antonio to Terrace

There are several ways to calculate the distance from San Antonio to Terrace. Here are two standard methods:

Vincenty's formula (applied above)
  • 2281.898 miles
  • 3672.359 kilometers
  • 1982.915 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
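For readers who want to reproduce the figure, below is a minimal Python sketch of the inverse Vincenty solution on the WGS-84 ellipsoid. It is an illustrative implementation, not the calculator's own code; the only constants it assumes are the standard WGS-84 semi-major axis and flattening.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Inverse Vincenty solution on the WGS-84 ellipsoid; returns statute miles."""
    a = 6378137.0               # WGS-84 semi-major axis (metres)
    f = 1 / 298.257223563       # WGS-84 flattening
    b = (1 - f) * a             # semi-minor axis (metres)

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0          # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1609.344   # metres -> statute miles
```

Calling vincenty_miles(29.5336, -98.4697, 54.4683, -128.5758) with the airport coordinates converted to decimal degrees (see Airport information below) should land very close to the 2281.9-mile figure above.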

Haversine formula
  • 2280.770 miles
  • 3670.544 kilometers
  • 1981.935 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
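For comparison, here is a short Python sketch of the haversine calculation. It assumes a mean Earth radius of 3958.8 miles; the decimal coordinates are converted from the DMS values listed under Airport information.

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance in statute miles, assuming a spherical Earth."""
    R = 3958.8  # mean Earth radius in miles
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

# SAT (29°32′1″N, 98°28′11″W) and YXT (54°28′6″N, 128°34′33″W) in decimal degrees
sat = (29.5336, -98.4697)
yxt = (54.4683, -128.5758)
print(round(haversine_miles(*sat, *yxt)))  # roughly 2281 miles, close to the figure above
```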

How long does it take to fly from San Antonio to Terrace?

The estimated flight time from San Antonio International Airport to Northwest Regional Airport Terrace-Kitimat is 4 hours and 49 minutes.
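Estimates like this are usually derived from the great-circle distance divided by an assumed average speed, plus a fixed allowance for taxi, climb, and descent. The constants in the sketch below are illustrative assumptions, not the calculator's actual parameters, so the result does not exactly match the 4 hours 49 minutes quoted above.

```python
def estimated_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough block-time estimate: cruise segment plus a fixed taxi/climb/descent allowance.
    The 500 mph speed and 30-minute overhead are illustrative assumptions."""
    total_min = overhead_min + distance_miles / cruise_mph * 60
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} hours {minutes} minutes"

# About "5 hours 4 minutes" with these assumed constants; the page's 4 hours 49 minutes
# implies slightly different speed/overhead values.
print(estimated_flight_time(2282))
```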

Flight carbon footprint between San Antonio International Airport (SAT) and Northwest Regional Airport Terrace-Kitimat (YXT)

On average, flying from San Antonio to Terrace generates about 250 kg of CO2 per passenger, which is equal to about 551 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
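The unit conversion and a rough per-distance estimate can be sketched as follows. The 0.11 kg-per-passenger-mile factor is an illustrative assumption chosen to roughly reproduce the 250 kg figure, not the calculator's actual emission model.

```python
KG_PER_LB = 0.45359237

def kg_to_lbs(kg):
    """Convert kilograms to pounds."""
    return kg / KG_PER_LB

def co2_estimate_kg(distance_miles, kg_per_mile=0.11):
    """Rough per-passenger CO2 estimate; the per-mile factor is an assumption."""
    return distance_miles * kg_per_mile

print(round(kg_to_lbs(250)))          # 551 lbs
print(round(co2_estimate_kg(2282)))   # about 251 kg for the SAT-YXT leg
```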

Map of flight path and driving directions from San Antonio to Terrace

See the map of the shortest flight path between San Antonio International Airport (SAT) and Northwest Regional Airport Terrace-Kitimat (YXT).

Airport information

Origin: San Antonio International Airport
City: San Antonio, TX
Country: United States
IATA Code: SAT
ICAO Code: KSAT
Coordinates: 29°32′1″N, 98°28′11″W
Destination: Northwest Regional Airport Terrace-Kitimat
City: Terrace
Country: Canada
IATA Code: YXT
ICAO Code: CYXT
Coordinates: 54°28′6″N, 128°34′33″W
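The distance formulas above take decimal degrees, while the coordinates here are listed in degrees, minutes, and seconds. A minimal conversion sketch (the helper name is my own):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# SAT: 29°32′1″N, 98°28′11″W  ->  (29.5336, -98.4697)
# YXT: 54°28′6″N, 128°34′33″W ->  (54.4683, -128.5758)
sat = (dms_to_decimal(29, 32, 1, "N"), dms_to_decimal(98, 28, 11, "W"))
yxt = (dms_to_decimal(54, 28, 6, "N"), dms_to_decimal(128, 34, 33, "W"))
print(sat, yxt)
```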