
How far is Poplar Hill from Windsor Locks, CT?

The distance between Windsor Locks (Bradley International Airport) and Poplar Hill (Poplar Hill Airport) is 1230 miles / 1980 kilometers / 1069 nautical miles.

The driving distance from Windsor Locks (BDL) to Poplar Hill (YHP) is 1787 miles / 2876 kilometers, and travel time by car is about 38 hours 39 minutes.

Bradley International Airport – Poplar Hill Airport

  • 1230 miles
  • 1980 kilometers
  • 1069 nautical miles

Distance from Windsor Locks to Poplar Hill

There are several ways to calculate the distance from Windsor Locks to Poplar Hill. Here are two standard methods:

Vincenty's formula (applied above)
  • 1230.263 miles
  • 1979.916 kilometers
  • 1069.069 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
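As a rough illustration, an ellipsoidal distance like this can be reproduced with the geopy library. This is a minimal sketch, not the calculator's own code: geopy's geodesic() solver uses Karney's algorithm on the WGS-84 ellipsoid rather than Vincenty's iteration, but both are ellipsoidal methods and agree closely on a route like this.

    # Ellipsoidal (WGS-84) distance sketch using geopy's geodesic() solver.
    from geopy.distance import geodesic

    bdl = (41.9389, -72.6831)   # Bradley International Airport (41°56′20″N, 72°40′59″W)
    yhp = (52.1131, -94.2556)   # Poplar Hill Airport (52°6′47″N, 94°15′20″W)

    d = geodesic(bdl, yhp)
    print(f"{d.miles:.1f} mi, {d.km:.1f} km, {d.nautical:.1f} nm")  # roughly 1230 mi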

Haversine formula
  • 1227.948 miles
  • 1976.191 kilometers
  • 1067.058 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
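For comparison, the haversine great-circle calculation is simple enough to write out directly. The sketch below assumes a mean earth radius of about 3,958.8 miles and uses the airport coordinates listed at the bottom of the page.

    import math

    def haversine_miles(lat1, lon1, lat2, lon2, radius_miles=3958.8):
        # Great-circle distance on a sphere of the given radius.
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlmb = math.radians(lon2 - lon1)
        a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
        return 2 * radius_miles * math.asin(math.sqrt(a))

    # BDL -> YHP, decimal degrees converted from the coordinates below
    print(haversine_miles(41.9389, -72.6831, 52.1131, -94.2556))  # roughly 1228 miles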

How long does it take to fly from Windsor Locks to Poplar Hill?

The estimated flight time from Bradley International Airport to Poplar Hill Airport is 2 hours and 49 minutes.
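The exact assumptions behind this figure aren't stated, but estimates like it are typically built from the great-circle distance, a typical cruise speed, and a fixed allowance for taxi, takeoff, climb, and descent. A minimal sketch, assuming a 500 mph cruise speed and a 30-minute allowance (both illustrative values, not the calculator's own):

    def estimate_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
        # Rough airport-to-airport time: cruise time plus a fixed overhead.
        # Cruise speed and overhead are assumptions for illustration only.
        total_min = distance_miles / cruise_mph * 60 + overhead_min
        hours, minutes = divmod(round(total_min), 60)
        return hours, minutes

    print(estimate_flight_time(1230))  # (2, 58) -- in the same ballpark as 2 h 49 min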

Flight carbon footprint between Bradley International Airport (BDL) and Poplar Hill Airport (YHP)

On average, flying from Windsor Locks to Poplar Hill generates about 163 kg of CO2 per passenger, which is roughly 359 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
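The pound conversion, and the per-mile rate implied by the two numbers above, are straightforward to check. The per-mile figure below is simply 163 kg divided by the 1230-mile distance, not a published emission factor.

    co2_kg = 163            # estimated CO2 per passenger for this route
    distance_miles = 1230

    co2_lbs = co2_kg * 2.20462          # kilograms to pounds
    kg_per_mile = co2_kg / distance_miles

    print(f"{co2_lbs:.0f} lbs")                              # ~359 lbs
    print(f"{kg_per_mile:.3f} kg CO2 per passenger-mile")    # ~0.133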

Map of flight path and driving directions from Windsor Locks to Poplar Hill

See the map of the shortest flight path between Bradley International Airport (BDL) and Poplar Hill Airport (YHP).

Airport information

Origin Bradley International Airport
City: Windsor Locks, CT
Country: United States
IATA Code: BDL
ICAO Code: KBDL
Coordinates: 41°56′20″N, 72°40′59″W
Destination Poplar Hill Airport
City: Poplar Hill
Country: Canada
IATA Code: YHP
ICAO Code: CPV7
Coordinates: 52°6′47″N, 94°15′20″W