
How far is Windsor Locks, CT, from Bullhead City, AZ?

The distance between Bullhead City (Laughlin/Bullhead International Airport) and Windsor Locks (Bradley International Airport) is 2292 miles / 3689 kilometers / 1992 nautical miles.

The driving distance from Bullhead City (IFP) to Windsor Locks (BDL) is 2628 miles / 4230 kilometers, and travel time by car is about 47 hours 16 minutes.


Distance from Bullhead City to Windsor Locks

There are several ways to calculate the distance from Bullhead City to Windsor Locks. Here are two standard methods:

Vincenty's formula (applied above)
  • 2292.492 miles
  • 3689.409 kilometers
  • 1992.121 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
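As an illustration, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The coordinates are the decimal-degree equivalents of the airport positions listed at the bottom of the page; the iteration limit and convergence tolerance are arbitrary choices, and the sketch ignores edge cases such as near-antipodal points.

    import math

    def vincenty_miles(lat1, lon1, lat2, lon2):
        """Vincenty's inverse formula on the WGS-84 ellipsoid; returns statute miles."""
        a = 6378137.0              # semi-major axis (meters)
        f = 1 / 298.257223563      # flattening
        b = a * (1 - f)            # semi-minor axis (meters)

        U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
        U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
        L = math.radians(lon2 - lon1)
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)

        lam = L
        for _ in range(200):  # iterate until the longitude difference converges
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.hypot(cosU2 * sin_lam,
                                   cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos2_alpha = 1 - sin_alpha ** 2
            cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
            C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
            lam_prev = lam
            lam = L + (1 - C) * f * sin_alpha * (
                sigma + C * sin_sigma * (
                    cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
            if abs(lam - lam_prev) < 1e-12:
                break

        u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
              * (-3 + 4 * cos_2sigma_m ** 2)))
        meters = b * A * (sigma - delta_sigma)
        return meters / 1609.344  # meters -> statute miles

    # IFP and BDL in decimal degrees (west longitude negative)
    print(round(vincenty_miles(35.157222, -114.559722, 41.938889, -72.683056), 3))
    # ≈ 2292.5 miles, matching the figure above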

Haversine formula
  • 2287.352 miles
  • 3681.137 kilometers
  • 1987.655 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
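A haversine implementation is much shorter. The sketch below assumes a mean earth radius of 3958.8 miles; the small gap between its result and Vincenty's reflects the spherical approximation.

    import math

    def haversine_miles(lat1, lon1, lat2, lon2, radius_mi=3958.8):
        """Great-circle distance between two points, assuming a spherical earth."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        a = (math.sin(dphi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
        return 2 * radius_mi * math.asin(math.sqrt(a))

    # IFP and BDL in decimal degrees (west longitude negative)
    print(round(haversine_miles(35.157222, -114.559722, 41.938889, -72.683056), 3))
    # ≈ 2287.4 miles, matching the haversine figure above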

How long does it take to fly from Bullhead City to Windsor Locks?

The estimated flight time from Laughlin/Bullhead International Airport to Bradley International Airport is 4 hours and 50 minutes.
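The site does not publish its estimation method, but a common rule of thumb is cruise time at a fixed average speed plus an allowance for taxi, climb, and descent. The sketch below uses an assumed 530 mph average and a 30-minute allowance, which happen to reproduce the figure above.

    # Hypothetical rule of thumb; the 530 mph and 30-minute figures are
    # assumptions for illustration, not the site's published method.
    distance_miles = 2292
    cruise_mph = 530          # assumed average speed
    overhead_hours = 0.5      # assumed taxi/climb/descent allowance

    total_hours = distance_miles / cruise_mph + overhead_hours
    hours, minutes = int(total_hours), round(total_hours % 1 * 60)
    print(f"{hours} h {minutes} min")  # ≈ 4 h 49 min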

Flight carbon footprint between Laughlin/Bullhead International Airport (IFP) and Bradley International Airport (BDL)

On average, flying from Bullhead City to Windsor Locks generates about 251 kg (553 lb) of CO2 per passenger. These figures are estimates that include only the CO2 generated by burning jet fuel.
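Working backwards from the quoted numbers, the estimate corresponds to roughly 0.11 kg of CO2 per passenger-mile. That per-mile factor is inferred here for illustration, not taken from a published methodology.

    distance_miles = 2292
    kg_per_passenger_mile = 0.1095   # inferred from the quoted 251 kg figure

    co2_kg = kg_per_passenger_mile * distance_miles   # ≈ 251 kg
    co2_lb = co2_kg * 2.20462                         # kg -> lb
    print(f"{co2_kg:.0f} kg = {co2_lb:.0f} lb")       # 251 kg = 553 lb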

Map of flight path and driving directions from Bullhead City to Windsor Locks

See the map of the shortest flight path between Laughlin/Bullhead International Airport (IFP) and Bradley International Airport (BDL).

Airport information

Origin: Laughlin/Bullhead International Airport
City: Bullhead City, AZ
Country: United States
IATA Code: IFP
ICAO Code: KIFP
Coordinates: 35°9′26″N, 114°33′35″W

Destination: Bradley International Airport
City: Windsor Locks, CT
Country: United States
IATA Code: BDL
ICAO Code: KBDL
Coordinates: 41°56′20″N, 72°40′59″W
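The distance formulas above take decimal degrees, so the listed coordinates must first be converted from degrees/minutes/seconds. A minimal converter:

    def dms_to_decimal(degrees, minutes, seconds, hemisphere):
        """Convert degrees/minutes/seconds to decimal degrees; S and W are negative."""
        sign = -1 if hemisphere in ("S", "W") else 1
        return sign * (degrees + minutes / 60 + seconds / 3600)

    # Coordinates from the airport information above
    print(round(dms_to_decimal(35, 9, 26, "N"), 6),
          round(dms_to_decimal(114, 33, 35, "W"), 6))   # IFP: 35.157222 -114.559722
    print(round(dms_to_decimal(41, 56, 20, "N"), 6),
          round(dms_to_decimal(72, 40, 59, "W"), 6))    # BDL: 41.938889 -72.683056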