
How far is Bullhead City, AZ, from Windsor Locks, CT?

The distance between Windsor Locks (Bradley International Airport) and Bullhead City (Laughlin/Bullhead International Airport) is 2292 miles / 3689 kilometers / 1992 nautical miles.

The driving distance from Windsor Locks (BDL) to Bullhead City (IFP) is 2629 miles / 4231 kilometers, and travel time by car is about 47 hours 13 minutes.
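From the driving figures quoted above, the implied average speed can be worked out directly. A minimal sketch (the 2629 miles and 47 h 13 min are taken from this page; the rest is plain arithmetic):

```python
# Implied average driving speed from the figures above
distance_mi = 2629
hours = 47 + 13 / 60          # 47 hours 13 minutes
avg_mph = distance_mi / hours
print(round(avg_mph, 1))      # roughly highway pace with stops factored in
```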


Distance from Windsor Locks to Bullhead City

There are several ways to calculate the distance from Windsor Locks to Bullhead City. Here are two standard methods:

Vincenty's formula (applied above)
  • 2292.492 miles
  • 3689.409 kilometers
  • 1992.121 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
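Vincenty's inverse method iterates on the geodesic between the two points over the WGS-84 ellipsoid. A self-contained sketch of the standard published iteration (the airport coordinates are taken from the table below; the implementation itself is the textbook algorithm, not this site's exact code):

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns statute miles."""
    a = 6378137.0             # semi-major axis (m)
    f = 1 / 298.257223563     # flattening
    b = (1 - f) * a           # semi-minor axis

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    lam = L
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.sqrt((cosU2 * sin_lam) ** 2 +
                              (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        if sin_sigma == 0:
            return 0.0        # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m + C * cos_sigma *
                                     (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
        B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) *
        (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1609.344  # meters -> statute miles

# BDL (41°56′20″N, 72°40′59″W) and IFP (35°9′26″N, 114°33′35″W)
bdl = (41 + 56/60 + 20/3600, -(72 + 40/60 + 59/3600))
ifp = (35 + 9/60 + 26/3600, -(114 + 33/60 + 35/3600))
print(round(vincenty_miles(*bdl, *ifp), 1))
```

The iteration converges in a handful of steps for points this far from antipodal; the result agrees with the 2292-mile figure above.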

Haversine formula
  • 2287.352 miles
  • 3681.137 kilometers
  • 1987.655 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
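The haversine formula is much shorter, since a sphere needs no iteration. A minimal sketch using a mean Earth radius of 3958.8 statute miles (the coordinates are from the table below; the radius choice is an assumption and shifts the result slightly):

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance on a spherical Earth, in statute miles."""
    R = 3958.8  # mean Earth radius in miles (assumed value)
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2 +
         math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

# BDL (41°56′20″N, 72°40′59″W) and IFP (35°9′26″N, 114°33′35″W)
bdl = (41 + 56/60 + 20/3600, -(72 + 40/60 + 59/3600))
ifp = (35 + 9/60 + 26/3600, -(114 + 33/60 + 35/3600))
print(round(haversine_miles(*bdl, *ifp), 1))
```

The spherical result lands within a few miles of the ellipsoidal Vincenty figure, which is typical for mid-latitude routes.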

How long does it take to fly from Windsor Locks to Bullhead City?

The estimated flight time from Bradley International Airport to Laughlin/Bullhead International Airport is 4 hours and 50 minutes.
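Flight-time estimates like this are commonly derived from distance with a rule of thumb: a fixed overhead for taxi, climb, and descent plus cruise time at a typical jet speed. The 500 mph cruise and 30-minute overhead below are illustrative assumptions, not this site's exact model, so the output differs slightly from the figure above:

```python
def estimate_flight_time(miles, cruise_mph=500, overhead_min=30):
    """Rough flight-time estimate: fixed overhead + cruise at a typical jet speed.

    cruise_mph and overhead_min are assumed rule-of-thumb values.
    """
    minutes = overhead_min + miles / cruise_mph * 60
    h, m = divmod(round(minutes), 60)
    return f"{h} h {m:02d} min"

print(estimate_flight_time(2292))  # -> 5 h 05 min
```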

Flight carbon footprint between Bradley International Airport (BDL) and Laughlin/Bullhead International Airport (IFP)

On average, flying from Windsor Locks to Bullhead City generates about 251 kg (553 lb) of CO2 per passenger. These figures are estimates and include only the CO2 produced by burning jet fuel.
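The kilogram-to-pound conversion behind that figure is a single multiplication (the 251 kg value is from this page; the conversion factor is the standard one):

```python
KG_TO_LB = 2.20462            # pounds per kilogram

co2_kg = 251                  # per-passenger CO2 estimate from above
print(round(co2_kg * KG_TO_LB))  # -> 553
```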


Airport information

Origin: Bradley International Airport
City: Windsor Locks, CT
Country: United States
IATA Code: BDL
ICAO Code: KBDL
Coordinates: 41°56′20″N, 72°40′59″W
Destination: Laughlin/Bullhead International Airport
City: Bullhead City, AZ
Country: United States
IATA Code: IFP
ICAO Code: KIFP
Coordinates: 35°9′26″N, 114°33′35″W
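The coordinates above are given in degrees-minutes-seconds; distance formulas want signed decimal degrees. A small parser (the function name and regex are illustrative, not from any particular library):

```python
import re

def dms_to_decimal(dms: str) -> float:
    """Convert e.g. '41°56′20″N' to signed decimal degrees (negative for S/W)."""
    m = re.match(r"(\d+)°(\d+)′(\d+)″([NSEW])", dms)
    deg, minutes, sec, hemi = (int(m.group(1)), int(m.group(2)),
                               int(m.group(3)), m.group(4))
    value = deg + minutes / 60 + sec / 3600
    return -value if hemi in "SW" else value

print(round(dms_to_decimal("41°56′20″N"), 4))   # BDL latitude
print(round(dms_to_decimal("114°33′35″W"), 4))  # IFP longitude (negative)
```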