How far is Thunder Bay from Beatrice, NE?

The distance between Beatrice (Beatrice Municipal Airport) and Thunder Bay (Thunder Bay International Airport) is 667 miles / 1074 kilometers / 580 nautical miles.

The driving distance from Beatrice (BIE) to Thunder Bay (YQT) is 814 miles / 1310 kilometers, and travel time by car is about 15 hours 36 minutes.

Beatrice Municipal Airport – Thunder Bay International Airport

667 miles / 1074 kilometers / 580 nautical miles

Distance from Beatrice to Thunder Bay

There are several ways to calculate the distance from Beatrice to Thunder Bay. Here are two standard methods:

Vincenty's formula (applied above)
  • 667.183 miles
  • 1073.727 kilometers
  • 579.766 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
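For reference, below is a minimal, self-contained Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The function name vincenty_miles is illustrative, and the decimal coordinates are converted from the airport coordinates listed at the bottom of this page; this is a sketch, not the calculator's exact implementation.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Vincenty's inverse formula on the WGS-84 ellipsoid, in statute miles.

    A minimal sketch: it can fail to converge for nearly antipodal points,
    which production libraries handle with fallbacks.
    """
    a = 6378137.0             # WGS-84 semi-major axis (meters)
    f = 1 / 298.257223563     # WGS-84 flattening
    b = (1 - f) * a           # semi-minor axis (meters)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitude
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0.0:
            return 0.0        # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1.0 - sin_alpha ** 2
        # cos(2 * sigma_m); cos2_alpha == 0 only for equatorial geodesics
        cos_2sm = (cos_sigma - 2.0 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
        C = f / 16.0 * cos2_alpha * (4.0 + f * (4.0 - 3.0 * cos2_alpha))
        lam_prev = lam
        lam = L + (1.0 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1.0 + 2.0 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break
    else:
        raise RuntimeError("no convergence (points may be nearly antipodal)")

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma) / 1609.344  # meters -> statute miles

# BIE -> YQT: prints roughly 667.18, the mileage quoted above.
print(vincenty_miles(40.3011, -96.7539, 48.3717, -89.3239))
```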

Haversine formula
  • 666.949 miles
  • 1073.350 kilometers
  • 579.563 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
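A compact sketch of the haversine formula, assuming the commonly used mean Earth radius of 6,371 km (the calculator's exact radius value is not published):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given mean radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# BIE -> YQT: prints roughly 1073 km, matching the haversine figure above.
print(haversine_km(40.3011, -96.7539, 48.3717, -89.3239))
```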

How long does it take to fly from Beatrice to Thunder Bay?

The estimated flight time from Beatrice Municipal Airport to Thunder Bay International Airport is 1 hour and 45 minutes.
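The calculator does not publish its timing formula. One common rough rule is block time = distance ÷ average cruise speed, plus a fixed allowance for taxi, climb, and descent; the 500 mph cruise and 30-minute overhead below are illustrative assumptions, not the site's parameters, and they land close to the quoted figure:

```python
def flight_time_hours(distance_miles, cruise_mph=500, overhead_hours=0.5):
    """Rough block-time estimate: cruise segment plus fixed overhead.

    cruise_mph and overhead_hours are illustrative assumptions.
    """
    return distance_miles / cruise_mph + overhead_hours

hours = flight_time_hours(667)
print(f"{int(hours)} h {round(hours % 1 * 60)} min")  # about 1 h 50 min
```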

Flight carbon footprint between Beatrice Municipal Airport (BIE) and Thunder Bay International Airport (YQT)

On average, flying from Beatrice to Thunder Bay generates about 121 kg of CO2 per passenger; 121 kilograms is equivalent to 266 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
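The kilogram-to-pound conversion is easy to check (1 kg ≈ 2.20462 lb); the page appears to truncate rather than round:

```python
KG_TO_LB = 2.20462  # pounds per kilogram

co2_kg = 121
print(f"{co2_kg} kg = {int(co2_kg * KG_TO_LB)} lb")  # 266 lb, as quoted above
```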

Map of flight path and driving directions from Beatrice to Thunder Bay

See the map of the shortest flight path between Beatrice Municipal Airport (BIE) and Thunder Bay International Airport (YQT), along with the driving directions.

Airport information

Origin: Beatrice Municipal Airport
City: Beatrice, NE
Country: United States
IATA Code: BIE
ICAO Code: KBIE
Coordinates: 40°18′4″N, 96°45′14″W

Destination: Thunder Bay International Airport
City: Thunder Bay
Country: Canada
IATA Code: YQT
ICAO Code: CYQT
Coordinates: 48°22′18″N, 89°19′26″W
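To use the coordinates above with the distance formulas shown earlier, they must first be converted from degrees-minutes-seconds to decimal degrees. A small parser like the hypothetical dms_to_decimal below does the conversion:

```python
import re

def dms_to_decimal(dms: str) -> float:
    """Convert a coordinate like 40°18′4″N to signed decimal degrees."""
    deg, minute, sec, hemi = re.match(r"(\d+)°(\d+)′(\d+)″([NSEW])", dms).groups()
    value = int(deg) + int(minute) / 60 + int(sec) / 3600
    return -value if hemi in "SW" else value  # south/west are negative

# Beatrice Municipal Airport (BIE)
print(dms_to_decimal("40°18′4″N"), dms_to_decimal("96°45′14″W"))
# ≈ 40.3011, -96.7539
```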