How far is Thunder Bay from North Platte, NE?

The distance between North Platte (North Platte Regional Airport) and Thunder Bay (Thunder Bay International Airport) is 749 miles / 1205 kilometers / 651 nautical miles.

The driving distance from North Platte (LBF) to Thunder Bay (YQT) is 990 miles / 1593 kilometers, and travel time by car is about 18 hours 24 minutes.

North Platte Regional Airport – Thunder Bay International Airport

  • 749 miles
  • 1205 kilometers
  • 651 nautical miles

Distance from North Platte to Thunder Bay

There are several ways to calculate the distance from North Platte to Thunder Bay. Here are two standard methods:

Vincenty's formula (applied above)
  • 748.758 miles
  • 1205.009 kilometers
  • 650.653 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
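As an illustration, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The airport coordinates are taken from the airport information below, converted to decimal degrees; the exact parameters the calculator uses are not stated, so treat this as a sketch rather than the site's implementation.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2):
    """Vincenty's inverse formula on the WGS-84 ellipsoid; returns statute miles."""
    a = 6378137.0              # semi-major axis (m)
    f = 1 / 298.257223563      # flattening
    b = (1 - f) * a            # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate until the auxiliary longitude converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    meters = b * A * (sigma - delta_sigma)
    return meters / 1609.344  # meters -> statute miles

# LBF (41°7′34″N, 100°41′2″W) and YQT (48°22′18″N, 89°19′26″W) in decimal degrees
print(vincenty_miles(41.12611, -100.68389, 48.37167, -89.32389))  # ~748.8
```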

Haversine formula
  • 747.815 miles
  • 1203.492 kilometers
  • 649.834 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
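For comparison, a compact haversine sketch. The mean Earth radius of 6371 km is an assumption; other common radius choices shift the result slightly, which is why the two methods disagree by about a mile.

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance on a sphere of mean radius ~6371 km; returns miles."""
    R = 6371.0088  # IUGG mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2
    km = 2 * R * math.asin(math.sqrt(h))
    return km / 1.609344  # km -> statute miles

print(haversine_miles(41.12611, -100.68389, 48.37167, -89.32389))  # ~747.8
```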

How long does it take to fly from North Platte to Thunder Bay?

The estimated flight time from North Platte Regional Airport to Thunder Bay International Airport is 1 hour and 55 minutes.
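Estimates like this are typically derived from distance alone. A hedged sketch, assuming a cruise speed of about 500 mph plus a fixed allowance of roughly 30 minutes for taxi, takeoff, climb, and descent; the calculator's actual parameters are not stated, so the result only approximates the figure above.

```python
def flight_time_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough block-time estimate: fixed overhead plus cruise at a constant speed."""
    return overhead_min + distance_miles / cruise_mph * 60

minutes = flight_time_minutes(749)
print(f"{int(minutes // 60)} h {int(minutes % 60)} min")  # ~1 h 59 min, close to the quoted 1 h 55 min
```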

Flight carbon footprint between North Platte Regional Airport (LBF) and Thunder Bay International Airport (YQT)

On average, flying from North Platte to Thunder Bay generates about 130 kg (286 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
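The per-passenger figure works out to roughly 0.11 kg of CO2 per kilometer flown. A minimal sketch of that arithmetic, with the emission factor treated as an assumption inferred from the numbers on this page:

```python
KG_PER_KM = 130 / 1205      # ~0.108 kg CO2 per passenger-km, inferred from this route
LBS_PER_KG = 2.20462        # kilograms -> pounds

co2_kg = KG_PER_KM * 1205   # route distance in km
print(round(co2_kg), "kg")                 # 130 kg
print(round(co2_kg * LBS_PER_KG), "lbs")   # ~287 lbs; the article rounds to 286
```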

Map of flight path and driving directions from North Platte to Thunder Bay

See the map of the shortest flight path between North Platte Regional Airport (LBF) and Thunder Bay International Airport (YQT).

Airport information

Origin: North Platte Regional Airport
City: North Platte, NE
Country: United States
IATA Code: LBF
ICAO Code: KLBF
Coordinates: 41°7′34″N, 100°41′2″W
Destination: Thunder Bay International Airport
City: Thunder Bay
Country: Canada
IATA Code: YQT
ICAO Code: CYQT
Coordinates: 48°22′18″N, 89°19′26″W
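The coordinates above are given in degrees, minutes, and seconds, while the distance formulas earlier expect decimal degrees. A small conversion sketch:

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

# LBF: 41°7′34″N, 100°41′2″W
print(dms_to_decimal(41, 7, 34, "N"), dms_to_decimal(100, 41, 2, "W"))   # 41.12611 -100.68389
# YQT: 48°22′18″N, 89°19′26″W
print(dms_to_decimal(48, 22, 18, "N"), dms_to_decimal(89, 19, 26, "W"))  # 48.37167 -89.32389
```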