
How far is Philadelphia, PA, from Santiago?

The distance between Santiago (Santiago International Airport) and Philadelphia (Wings Field) is 5067 miles / 8154 kilometers / 4403 nautical miles.


Distance from Santiago to Philadelphia

There are several ways to calculate the distance from Santiago to Philadelphia. Here are two standard methods:

Vincenty's formula (applied above)
  • 5066.770 miles
  • 8154.175 kilometers
  • 4402.902 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
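
For illustration, the inverse Vincenty computation can be sketched directly in Python on the WGS-84 ellipsoid. The convergence tolerance and iteration cap below are arbitrary implementation choices, and the coordinates are the airport positions from this page converted to decimal degrees:

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2):
    """Inverse Vincenty distance on the WGS-84 ellipsoid, in statute miles."""
    a = 6378137.0               # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563       # WGS-84 flattening
    b = (1 - f) * a             # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):        # iterate lambda to convergence
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0          # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # cos(2*sigma_m); cos2_alpha == 0 only for equatorial lines
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    meters = b * A * (sigma - delta_sigma)
    return meters / 1609.344    # meters -> statute miles

# SCL (33°23'34"S, 70°47'8"W) and BBX (40°8'15"N, 75°15'54"W) in decimal degrees
print(vincenty_miles(-33.39278, -70.78556, 40.13750, -75.26500))  # ~5066.77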

Haversine formula
  • 5088.520 miles
  • 8189.178 kilometers
  • 4421.803 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
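
For comparison, here is a minimal haversine sketch in Python. The mean Earth radius of 6,371 km and the unit-conversion constants are standard values; the exact radius chosen is a modeling assumption and shifts the result slightly:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given radius, in kilometers."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h))

km = haversine_km(-33.39278, -70.78556, 40.13750, -75.26500)
print(km, km / 1.609344, km / 1.852)  # ~8189 km, ~5088 mi, ~4421 nmi
```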

How long does it take to fly from Santiago to Philadelphia?

The estimated flight time from Santiago International Airport to Wings Field is 10 hours and 5 minutes.
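
The page does not state how the flight time is derived. A common rough heuristic is cruise time at a typical jet speed plus a fixed allowance for climb, descent, and taxi; the 500 mph speed and 30-minute allowance below are illustrative assumptions, not the calculator's actual parameters:

```python
def estimate_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough airliner flight-time estimate from assumed cruise speed + overhead."""
    total_min = distance_miles / cruise_mph * 60 + overhead_min
    return divmod(round(total_min), 60)  # (hours, minutes)

hours, minutes = estimate_flight_time(5067)
# ~10 h 38 min with these assumptions; the page's 10 h 5 min
# implies slightly different parameters
print(f"{hours} h {minutes} min")
```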

What is the time difference between Santiago and Philadelphia?

Santiago and Philadelphia share the same clock time for part of the year. Chile and the eastern United States both observe daylight saving time, but in opposite seasons, so the offset varies from zero hours (roughly April to September) to two hours (during the Chilean summer).

Flight carbon footprint between Santiago International Airport (SCL) and Wings Field (BBX)

On average, flying from Santiago to Philadelphia generates about 593 kg of CO2 per passenger, equivalent to 1,306 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
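
For a sense of scale, the figure implies roughly 0.117 kg of CO2 per passenger-mile (593 kg over 5,067 miles). The sketch below applies that implied factor as a simple distance-linear model; the factor is back-derived from this page's numbers, not a published methodology:

```python
KG_PER_PASSENGER_MILE = 593 / 5067  # factor implied by this route's estimate

def co2_kg(distance_miles, factor=KG_PER_PASSENGER_MILE):
    """Per-passenger CO2 estimate from jet-fuel burn (distance-linear model)."""
    return distance_miles * factor

kg = co2_kg(5067)
print(f"{kg:.0f} kg ≈ {kg * 2.20462:.0f} lbs")  # 593 kg ≈ 1307 lbs
# (the page's 1,306 lbs reflects rounding of the underlying kg value)
```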

Map of flight path from Santiago to Philadelphia

See the map of the shortest flight path between Santiago International Airport (SCL) and Wings Field (BBX).

Airport information

Origin: Santiago International Airport
City: Santiago
Country: Chile
IATA Code: SCL
ICAO Code: SCEL
Coordinates: 33°23′34″S, 70°47′8″W

Destination: Wings Field
City: Philadelphia, PA
Country: United States
IATA Code: BBX
ICAO Code: KLOM
Coordinates: 40°8′15″N, 75°15′54″W