
How far is Birmingham from Charlotte, NC?

The distance between Charlotte (Charlotte Douglas International Airport) and Birmingham (Birmingham Airport) is 3920 miles / 6308 kilometers / 3406 nautical miles.

Charlotte Douglas International Airport – Birmingham Airport

  • 3920 miles
  • 6308 kilometers
  • 3406 nautical miles


Distance from Charlotte to Birmingham

There are several ways to calculate the distance from Charlotte to Birmingham. Here are two standard methods:

Vincenty's formula (applied above)
  • 3919.738 miles
  • 6308.207 kilometers
  • 3406.159 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
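As a sketch of how such a calculation works (not this site's own implementation), Vincenty's inverse formula can be written in Python. The WGS-84 ellipsoid parameters are standard, and the airport coordinates are the ones listed below converted to decimal degrees:

```python
import math

def vincenty_inverse(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Distance in meters between two points on the WGS-84 ellipsoid
    (Vincenty's inverse formula)."""
    a = 6378137.0                 # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563         # WGS-84 flattening
    b = (1 - f) * a               # semi-minor axis

    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(phi1))   # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0            # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                   if cos2_alpha else 0.0)    # equatorial line: cos2_alpha = 0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
        * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma)

# CLT (35°12′50″N, 80°56′35″W) and BHX (52°27′14″N, 1°44′52″W)
clt = (35 + 12/60 + 50/3600, -(80 + 56/60 + 35/3600))
bhx = (52 + 27/60 + 14/3600, -(1 + 44/60 + 52/3600))
km = vincenty_inverse(*clt, *bhx) / 1000
print(f"{km:.1f} km")   # ≈ 6308 km, matching the figure above
```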

Haversine formula
  • 3910.472 miles
  • 6293.295 kilometers
  • 3398.107 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
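The haversine calculation is short enough to show in full. This sketch assumes the conventional mean Earth radius of 6371 km; with the airport coordinates below in decimal degrees it reproduces the figure above:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, r=6371.0):
    """Great-circle distance in km, assuming a sphere of mean radius r."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(h))

# CLT (35°12′50″N, 80°56′35″W) to BHX (52°27′14″N, 1°44′52″W)
km = haversine_km(35 + 12/60 + 50/3600, -(80 + 56/60 + 35/3600),
                  52 + 27/60 + 14/3600, -(1 + 44/60 + 52/3600))
print(f"{km:.1f} km")   # ≈ 6293 km
```

The ~15 km gap versus the Vincenty result reflects the spherical-Earth simplification.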

How long does it take to fly from Charlotte to Birmingham?

The estimated flight time from Charlotte Douglas International Airport to Birmingham Airport is 7 hours and 55 minutes.
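Estimates like this typically divide the distance by a typical jet cruise speed and add a fixed allowance for taxi, climb, and descent. The exact speed and overhead this site uses are not stated; the values below (500 mph cruise, 30 minutes overhead) are assumptions for illustration, so the result differs slightly from the figure above:

```python
def estimate_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough flight-time estimate: fixed taxi/climb/descent overhead
    plus time at an assumed average cruise speed."""
    total_min = overhead_min + distance_miles / cruise_mph * 60
    h, m = divmod(round(total_min), 60)
    return f"{h} h {m:02d} min"

print(estimate_flight_time(3920))   # 8 h 20 min with these assumptions
```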

Flight carbon footprint between Charlotte Douglas International Airport (CLT) and Birmingham Airport (BHX)

On average, flying from Charlotte to Birmingham generates about 446 kg of CO2 per passenger, which is roughly 984 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
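The kilogram-to-pound conversion uses the exact definition of the avoirdupois pound. Note that 446 kg converts to about 983.3 lb, so the 984 lb figure above presumably comes from an unrounded per-passenger value slightly over 446 kg:

```python
KG_PER_LB = 0.45359237            # exact definition of the avoirdupois pound

def kg_to_lb(kg):
    """Convert a mass in kilograms to pounds."""
    return kg / KG_PER_LB

print(round(kg_to_lb(446), 1))    # 983.3
```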

Map of flight path from Charlotte to Birmingham

See the map of the shortest flight path between Charlotte Douglas International Airport (CLT) and Birmingham Airport (BHX).

Airport information

Origin Charlotte Douglas International Airport
City: Charlotte, NC
Country: United States
IATA Code: CLT
ICAO Code: KCLT
Coordinates: 35°12′50″N, 80°56′35″W
Destination Birmingham Airport
City: Birmingham
Country: United Kingdom
IATA Code: BHX
ICAO Code: EGBB
Coordinates: 52°27′14″N, 1°44′52″W
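The coordinates above are listed in degrees, minutes, and seconds; the distance formulas need signed decimal degrees. A small helper (hypothetical, not part of this site) performs the conversion:

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter
    (N/S/E/W) to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

# CLT: 35°12′50″N, 80°56′35″W
print(round(dms_to_decimal(35, 12, 50, "N"), 4))   # 35.2139
print(round(dms_to_decimal(80, 56, 35, "W"), 4))   # -80.9431
```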