
How far is Thunder Bay from Deer Lake First Nation?

The distance between Deer Lake First Nation (Deer Lake Airport) and Thunder Bay (Thunder Bay International Airport) is 362 miles / 583 kilometers / 315 nautical miles.

Deer Lake Airport – Thunder Bay International Airport
362 miles / 583 kilometers / 315 nautical miles


Distance from Deer Lake First Nation to Thunder Bay

There are several ways to calculate the distance from Deer Lake First Nation to Thunder Bay. Here are two standard methods:

Vincenty's formula (applied above)
  • 362.146 miles
  • 582.817 kilometers
  • 314.696 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
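As a sketch of how such an ellipsoidal calculation can be done (not necessarily the exact implementation the calculator uses), here is the standard Vincenty inverse method on the WGS-84 ellipsoid in Python, applied to the airport coordinates listed below:

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns kilometers."""
    a = 6378137.0                 # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563         # WGS-84 flattening
    b = (1 - f) * a               # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0            # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # cos(2 * sigma_m); cos2_alpha is 0 only for equatorial geodesics
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0

# YVZ (52°39′20″N, 94°3′41″W) to YQT (48°22′18″N, 89°19′26″W)
print(vincenty_km(52.6556, -94.0614, 48.3717, -89.3239))  # ≈ 582.8 km
```

With coordinates rounded to four decimal places the result lands within a few tens of meters of the 582.817 km quoted above.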

Haversine formula
  • 361.678 miles
  • 582.064 kilometers
  • 314.289 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
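The haversine formula is short enough to show in full. A sketch in Python, using a mean earth radius of 6,371 km and the airport coordinates listed below:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a spherical earth (mean radius 6371 km)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# YVZ (52°39′20″N, 94°3′41″W) to YQT (48°22′18″N, 89°19′26″W)
km = haversine_km(52.6556, -94.0614, 48.3717, -89.3239)
print(km)             # ≈ 582 km
print(km * 0.621371)  # ≈ 362 miles
print(km * 0.539957)  # ≈ 314 nautical miles
```

Note the small gap between the two methods (about half a mile here): the spherical model slightly underestimates this route compared with the ellipsoidal one.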

How long does it take to fly from Deer Lake First Nation to Thunder Bay?

The estimated flight time from Deer Lake Airport to Thunder Bay International Airport is 1 hour and 11 minutes.
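The calculator does not publish its exact formula, but estimates like this are commonly built from a fixed allowance for taxi, climb, and descent plus cruise time at a typical jet speed. A hypothetical sketch (the 30-minute overhead and 500 mph cruise speed are assumptions, not the site's actual parameters):

```python
def estimated_flight_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough flight-time estimate: fixed overhead for taxi/climb/descent
    plus cruise time at an assumed average speed (hypothetical parameters)."""
    return overhead_min + distance_miles / cruise_mph * 60

minutes = estimated_flight_minutes(362)
print(f"{int(minutes // 60)} h {round(minutes % 60)} min")  # ≈ 1 h 13 min
```

With these assumed parameters the estimate comes out close to the quoted 1 hour 11 minutes.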

Flight carbon footprint between Deer Lake Airport (YVZ) and Thunder Bay International Airport (YQT)

On average, flying from Deer Lake First Nation to Thunder Bay generates about 78 kg (roughly 172 pounds) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
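A per-passenger figure like this is typically distance times an emission factor. As a sketch, a rough rule of thumb for short-haul flights is on the order of 0.13–0.15 kg of CO2 per passenger-kilometre; the 0.134 factor below is an assumption for illustration, not the calculator's published value:

```python
KG_PER_PAX_KM = 0.134   # assumed short-haul emission factor (kg CO2 per passenger-km)
KG_TO_LB = 2.20462      # kilograms to pounds

def co2_per_passenger_kg(distance_km, factor=KG_PER_PAX_KM):
    """Very rough per-passenger CO2 estimate from jet-fuel burn."""
    return distance_km * factor

kg = co2_per_passenger_kg(583)
print(round(kg), "kg =", round(kg * KG_TO_LB), "lb")  # ≈ 78 kg = 172 lb
```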

Map of flight path from Deer Lake First Nation to Thunder Bay

See the map of the shortest flight path between Deer Lake Airport (YVZ) and Thunder Bay International Airport (YQT).

Airport information

Origin: Deer Lake Airport
City: Deer Lake First Nation
Country: Canada
IATA Code: YVZ
ICAO Code: CYVZ
Coordinates: 52°39′20″N, 94°3′41″W
Destination: Thunder Bay International Airport
City: Thunder Bay
Country: Canada
IATA Code: YQT
ICAO Code: CYQT
Coordinates: 48°22′18″N, 89°19′26″W
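The coordinates above are given in degrees–minutes–seconds, while the distance formulas work in decimal degrees. The conversion is a small exercise; a sketch:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter (N/S/E/W)
    to signed decimal degrees (south and west are negative)."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

# Deer Lake Airport (YVZ): 52°39′20″N, 94°3′41″W
print(round(dms_to_decimal(52, 39, 20, "N"), 4))  # 52.6556
print(round(dms_to_decimal(94, 3, 41, "W"), 4))   # -94.0614
```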