
How far is San Juan from Windsor Locks, CT?

The distance between Windsor Locks (Bradley International Airport) and San Juan (Fernando Luis Ribas Dominicci Airport) is 1664 miles / 2677 kilometers / 1446 nautical miles.

Bradley International Airport – Fernando Luis Ribas Dominicci Airport

1664 miles / 2677 kilometers / 1446 nautical miles


Distance from Windsor Locks to San Juan

There are several ways to calculate the distance from Windsor Locks to San Juan. Here are two standard methods:

Vincenty's formula (applied above)
  • 1663.575 miles
  • 2677.265 kilometers
  • 1445.607 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
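For reference, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid, using the airport coordinates listed under "Airport information" converted to decimal degrees. The convergence tolerance and iteration cap are assumptions, but the result should match the Vincenty figures above to within rounding:

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Distance in metres between two points on the WGS-84 ellipsoid
    (Vincenty's inverse formula)."""
    a = 6378137.0            # semi-major axis (m)
    f = 1 / 298.257223563    # flattening
    b = (1 - f) * a          # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    # Reduced latitudes
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate until the longitude difference converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                   if cos2_alpha else 0.0)  # equatorial line
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
                          * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma)

# BDL -> SIG, decimal degrees from the coordinates in "Airport information"
m = vincenty_distance(41.938889, -72.683056, 18.456667, -66.098056)
print(f"{m / 1609.344:.3f} mi  {m / 1000:.3f} km  {m / 1852:.3f} NM")
```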

Haversine formula
  • 1668.103 miles
  • 2684.551 kilometers
  • 1449.541 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
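The haversine version is only a few lines. A minimal sketch, assuming a mean Earth radius of 6371 km (the site's exact radius constant isn't stated, so the last digits may differ slightly):

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in km, assuming a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h))

km = haversine_distance(41.938889, -72.683056, 18.456667, -66.098056)
print(f"{km * 1000 / 1609.344:.3f} mi  {km:.3f} km  {km * 1000 / 1852:.3f} NM")
```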

How long does it take to fly from Windsor Locks to San Juan?

The estimated flight time from Bradley International Airport to Fernando Luis Ribas Dominicci Airport is 3 hours and 38 minutes.
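The page doesn't state how this estimate is derived. A common rule of thumb is cruise time at a fixed average speed plus a flat allowance for taxi, climb, and descent; the sketch below uses an assumed 500 mph cruise and a 30-minute allowance, which are hypothetical parameters and won't necessarily reproduce the 3 hours 38 minutes quoted above:

```python
def estimated_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    """Rule-of-thumb block time: cruise leg plus a fixed taxi/climb/descent allowance."""
    total_min = distance_miles / cruise_mph * 60 + overhead_min
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} hours and {minutes} minutes"

print(estimated_flight_time(1664))  # result depends entirely on the assumed parameters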

What is the time difference between Windsor Locks and San Juan?

San Juan is on Atlantic Standard Time (UTC−4) year-round, while Windsor Locks observes Eastern Time. During daylight saving time (EDT, UTC−4) there is no time difference; in winter, San Juan is one hour ahead of Windsor Locks.

Flight carbon footprint between Bradley International Airport (BDL) and Fernando Luis Ribas Dominicci Airport (SIG)

On average, flying from Windsor Locks to San Juan generates about 190 kg of CO2 per passenger, which is equivalent to about 419 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
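The kilogram-to-pound conversion is straightforward, and the per-mile emission factor implied by these numbers (190 kg over 1,664 miles, roughly 0.114 kg of CO2 per passenger-mile) is a back-calculated assumption rather than the site's actual model:

```python
KG_PER_LB = 0.45359237  # exact definition of the avoirdupois pound

def co2_estimate_kg(distance_miles, kg_per_mile=190 / 1664):
    # kg_per_mile is back-calculated from this page's figures; real estimates
    # vary with aircraft type, load factor, and routing.
    return distance_miles * kg_per_mile

kg = co2_estimate_kg(1664)
print(f"{kg:.0f} kg CO2 = {kg / KG_PER_LB:.0f} lbs")  # 190 kg CO2 = 419 lbs
```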

Map of flight path from Windsor Locks to San Juan

See the map of the shortest flight path between Bradley International Airport (BDL) and Fernando Luis Ribas Dominicci Airport (SIG).

Airport information

Origin: Bradley International Airport
City: Windsor Locks, CT
Country: United States
IATA Code: BDL
ICAO Code: KBDL
Coordinates: 41°56′20″N, 72°40′59″W
Destination: Fernando Luis Ribas Dominicci Airport
City: San Juan
Country: Puerto Rico
IATA Code: SIG
ICAO Code: TJIG
Coordinates: 18°27′24″N, 66°5′53″W
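The coordinates above are given in degrees, minutes, and seconds. A small helper (the function name is my own) converts them to the signed decimal degrees used in the distance sketches earlier:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert D°M′S″ plus a hemisphere letter to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

bdl = (dms_to_decimal(41, 56, 20, "N"), dms_to_decimal(72, 40, 59, "W"))
sig = (dms_to_decimal(18, 27, 24, "N"), dms_to_decimal(66, 5, 53, "W"))
print(bdl)  # (41.938888..., -72.683055...)
print(sig)  # (18.456666..., -66.098055...)
```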