How far is San Borja from Seattle, WA?

The distance between Seattle (Seattle–Tacoma International Airport) and San Borja (Capitán Germán Quiroga Guardia Airport) is 5489 miles / 8834 kilometers / 4770 nautical miles.

Seattle–Tacoma International Airport – Capitán Germán Quiroga Guardia Airport

5489 miles / 8834 kilometers / 4770 nautical miles

Distance from Seattle to San Borja

There are several ways to calculate the distance from Seattle to San Borja. Here are two standard methods:

Vincenty's formula (applied above)
  • 5489.223 miles
  • 8834.048 kilometers
  • 4770.004 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
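For illustration, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid, fed with the SEA and SRJ coordinates from the airport information below. The calculator's own implementation is not published, so the ellipsoid parameters, tolerance and iteration cap here are assumptions; the result should nevertheless land very close to the figures quoted above.

```python
import math

def vincenty_inverse(lat1, lon1, lat2, lon2,
                     a=6378137.0, f=1/298.257223563,
                     tol=1e-12, max_iter=200):
    """Geodesic distance in meters on the WGS-84 ellipsoid
    (Vincenty's inverse formula)."""
    b = (1 - f) * a
    L = math.radians(lon2 - lon1)
    # Reduced latitudes
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = ((cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha)
                        if cos2_alpha else 0.0)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# SEA and SRJ coordinates (decimal degrees) from the airport information below
sea = (47.448889, -122.308889)
srj = (-14.859167, -66.737500)
meters = vincenty_inverse(*sea, *srj)
# Should print a distance close to the 8834 km / 5489 mi quoted above
print(f"{meters / 1000:.1f} km, {meters / 1609.344:.1f} mi")
```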

Haversine formula
  • 5499.381 miles
  • 8850.396 kilometers
  • 4778.832 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
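The haversine version is far simpler. The sketch below assumes a mean Earth radius of 6371 km (the radius is an assumption, since the page does not state it), which yields a distance in line with the figures above.

```python
import math

def haversine(lat1, lon1, lat2, lon2, r_km=6371.0):
    """Great-circle distance in kilometers on a sphere of radius r_km."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * r_km * math.asin(math.sqrt(a))

# SEA -> SRJ, coordinates from the airport information below
km = haversine(47.448889, -122.308889, -14.859167, -66.737500)
print(f"{km:.1f} km  /  {km / 1.609344:.1f} mi  /  {km / 1.852:.1f} NM")
```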

How long does it take to fly from Seattle to San Borja?

The estimated flight time from Seattle–Tacoma International Airport to Capitán Germán Quiroga Guardia Airport is 10 hours and 53 minutes.
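The calculator does not publish the assumptions behind this estimate. A rough sketch is simply distance divided by an assumed average block speed; the ~504 mph below is not a published parameter, just the speed implied by the page's own figures (5489 miles in 10 hours and 53 minutes).

```python
def flight_time(distance_miles, avg_speed_mph):
    """Very rough block-time estimate: distance over an assumed average speed."""
    hours = distance_miles / avg_speed_mph
    h, m = divmod(round(hours * 60), 60)
    return f"{h} h {m:02d} min"

# 504 mph is an assumption back-solved from this page's own numbers
print(flight_time(5489.223, 504))  # -> roughly 10 h 53 min
```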

Flight carbon footprint between Seattle–Tacoma International Airport (SEA) and Capitán Germán Quiroga Guardia Airport (SRJ)

On average, flying from Seattle to San Borja generates about 648 kg of CO2 per passenger, which is equivalent to 1,429 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
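The emission factor behind this estimate is not stated; dividing the quoted 648 kg by the 5489-mile distance implies roughly 0.118 kg of CO2 per passenger-mile. The sketch below uses that implied (assumed) factor and shows the kilogram-to-pound conversion.

```python
KG_PER_LB = 0.45359237  # kilograms per pound (exact definition)

def co2_per_passenger_kg(distance_miles, kg_per_mile=0.118):
    """Per-passenger CO2 estimate; kg_per_mile is the factor implied by this
    page's own numbers (648 kg over 5489 miles), not a published constant."""
    return distance_miles * kg_per_mile

kg = co2_per_passenger_kg(5489.223)
print(f"{kg:.0f} kg  ~  {kg / KG_PER_LB:.0f} lbs")  # roughly 648 kg ~ 1428 lbs
```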

Map of flight path from Seattle to San Borja

See the map of the shortest flight path between Seattle–Tacoma International Airport (SEA) and Capitán Germán Quiroga Guardia Airport (SRJ).

Airport information

Origin: Seattle–Tacoma International Airport
City: Seattle, WA
Country: United States
IATA Code: SEA
ICAO Code: KSEA
Coordinates: 47°26′56″N, 122°18′32″W
Destination: Capitán Germán Quiroga Guardia Airport
City: San Borja
Country: Bolivia
IATA Code: SRJ
ICAO Code: SLSB
Coordinates: 14°51′33″S, 66°44′15″W