
How far is Tampa, FL, from Bellingham, WA?

The distance between Bellingham (Bellingham International Airport) and Tampa (Tampa International Airport) is 2560 miles / 4119 kilometers / 2224 nautical miles.

The driving distance from Bellingham (BLI) to Tampa (TPA) is 3214 miles / 5173 kilometers, and travel time by car is about 57 hours.

Bellingham International Airport – Tampa International Airport

2560 miles / 4119 kilometers / 2224 nautical miles


Distance from Bellingham to Tampa

There are several ways to calculate the distance from Bellingham to Tampa. Here are two standard methods:

Vincenty's formula (applied above)
  • 2559.694 miles
  • 4119.429 kilometers
  • 2224.314 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
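If you want to reproduce the ellipsoidal figure yourself, the geopy library provides a convenient geodesic distance. A minimal sketch follows; note that geopy's geodesic() uses Karney's algorithm on the WGS-84 ellipsoid rather than Vincenty's formula itself, but for a route like this the two agree to well under a meter. The coordinates are the airport coordinates from the table at the bottom of this page, converted to decimal degrees.

  from geopy.distance import geodesic

  # Airport coordinates in decimal degrees (lat, lon), from the table below.
  bli = (48.7928, -122.5378)  # Bellingham International Airport
  tpa = (27.9753, -82.5331)   # Tampa International Airport

  d = geodesic(bli, tpa)  # WGS-84 ellipsoid (Karney's method)
  print(f"{d.miles:.1f} mi / {d.km:.1f} km / {d.nautical:.1f} nm")
  # Expect roughly 2560 mi / 4119 km / 2224 nm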

Haversine formula
  • 2556.882 miles
  • 4114.903 kilometers
  • 2221.870 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance, the shortest path between two points along the surface).
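The great-circle figure can be computed in a few lines of standard-library Python. The 6371 km mean earth radius used below is the usual choice for the haversine formula; a different radius constant would shift the result slightly.

  import math

  def haversine_km(lat1, lon1, lat2, lon2):
      """Great-circle distance in km between two lat/lon points (spherical earth)."""
      r = 6371.0  # mean earth radius in km
      phi1, phi2 = math.radians(lat1), math.radians(lat2)
      dphi = math.radians(lat2 - lat1)
      dlam = math.radians(lon2 - lon1)
      a = (math.sin(dphi / 2) ** 2
           + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
      return 2 * r * math.asin(math.sqrt(a))

  km = haversine_km(48.7928, -122.5378, 27.9753, -82.5331)
  print(f"{km:.1f} km / {km * 0.621371:.1f} mi / {km / 1.852:.1f} nm")
  # Expect roughly 4114.9 km / 2556.9 mi / 2221.9 nm, matching the figures above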

How long does it take to fly from Bellingham to Tampa?

The estimated flight time from Bellingham International Airport to Tampa International Airport is 5 hours and 20 minutes.
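The calculator does not publish the formula behind this estimate, but figures like this are typically a fixed taxi/climb/descent allowance plus the great-circle distance divided by an average speed. The constants in the sketch below are illustrative assumptions, not the site's published parameters; with a 30-minute allowance and a 460-knot average speed they happen to reproduce the quoted 5 hours 20 minutes.

  # Hypothetical sketch: the 0.5 h overhead and 460 kn average speed are
  # assumed constants chosen for illustration only.
  def flight_time_hours(distance_nm, avg_speed_kn=460, overhead_h=0.5):
      return overhead_h + distance_nm / avg_speed_kn

  t = flight_time_hours(2224)
  print(f"{int(t)} h {round((t % 1) * 60)} min")  # 5 h 20 min with these assumptions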

Flight carbon footprint between Bellingham International Airport (BLI) and Tampa International Airport (TPA)

On average, flying from Bellingham to Tampa generates about 282 kg (622 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
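The unit conversion is easy to check (1 kg ≈ 2.20462 lb), and dividing by the flight distance gives a rough per-mile figure:

  co2_kg = 282
  print(f"{co2_kg} kg = {co2_kg * 2.20462:.0f} lb")     # 622 lb
  print(f"~{co2_kg / 2560:.2f} kg CO2 per mile flown")  # ~0.11 kg/mi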

Map of flight path and driving directions from Bellingham to Tampa

See the map of the shortest flight path between Bellingham International Airport (BLI) and Tampa International Airport (TPA).

Airport information

Origin: Bellingham International Airport
City: Bellingham, WA
Country: United States
IATA Code: BLI
ICAO Code: KBLI
Coordinates: 48°47′34″N, 122°32′16″W
Destination: Tampa International Airport
City: Tampa, FL
Country: United States
IATA Code: TPA
ICAO Code: KTPA
Coordinates: 27°58′31″N, 82°31′59″W
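The coordinates above are given in degrees/minutes/seconds, while the distance formulas earlier on this page expect decimal degrees. The conversion is straightforward (N/E positive, S/W negative):

  def dms_to_decimal(deg, minutes, seconds, hemisphere):
      """Convert degrees/minutes/seconds plus hemisphere letter to decimal degrees."""
      sign = -1 if hemisphere in ("S", "W") else 1
      return sign * (deg + minutes / 60 + seconds / 3600)

  print(round(dms_to_decimal(48, 47, 34, "N"), 4))   # 48.7928   (BLI latitude)
  print(round(dms_to_decimal(122, 32, 16, "W"), 4))  # -122.5378 (BLI longitude)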