How far is Summer Beaver from Thompson?

The distance between Thompson (Thompson Airport) and Summer Beaver (Summer Beaver Airport) is 433 miles / 697 kilometers / 377 nautical miles.

The driving distance from Thompson (YTH) to Summer Beaver (SUR) is 964 miles / 1552 kilometers, and travel time by car is about 23 hours 19 minutes.

Thompson Airport – Summer Beaver Airport

  • 433 miles
  • 697 kilometers
  • 377 nautical miles

Distance from Thompson to Summer Beaver

There are several ways to calculate the distance from Thompson to Summer Beaver. Here are two standard methods:

Vincenty's formula (applied above)
  • 433.401 miles
  • 697.492 kilometers
  • 376.615 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
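The ellipsoidal calculation can be sketched in Python. This is a minimal implementation of Vincenty's inverse formula on the WGS-84 ellipsoid, using the two airports' coordinates converted to decimal degrees; the site does not publish its code, so treat this as an illustration of the method rather than the exact computation used above.

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2):
    # WGS-84 ellipsoid parameters
    a = 6378137.0            # semi-major axis (m)
    f = 1 / 298.257223563    # flattening
    b = (1 - f) * a          # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate until lambda converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos_sq_alpha = 1 - sin_alpha ** 2
        if cos_sq_alpha == 0:
            cos_2sigma_m = 0.0  # both points on the equator
        else:
            cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos_sq_alpha
        C = f / 16 * cos_sq_alpha * (4 + f * (4 - 3 * cos_sq_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u_sq = cos_sq_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
    B = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000  # metres -> km

# YTH -> SUR, coordinates in decimal degrees
print(round(vincenty_km(55.800833, -97.864167, 52.708333, -88.541667), 3))
```

The result comes out very close to the 697.492 km quoted above.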

Haversine formula
  • 432.204 miles
  • 695.565 kilometers
  • 375.575 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
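The spherical model is much simpler to compute. A minimal haversine sketch in Python, assuming a mean Earth radius of 6,371 km (the source does not state which radius it uses):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    R = 6371.0  # assumed mean Earth radius (km) for the spherical model
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

# YTH -> SUR, coordinates in decimal degrees
d = haversine_km(55.800833, -97.864167, 52.708333, -88.541667)
print(round(d, 3), "km,", round(d / 1.609344, 3), "miles")
```

The small gap between this result and the Vincenty figure (about 2 km here) comes entirely from treating the Earth as a sphere instead of an ellipsoid.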

How long does it take to fly from Thompson to Summer Beaver?

The estimated flight time from Thompson Airport to Summer Beaver Airport is 1 hour and 19 minutes.
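The source does not state how it derives this estimate. A common rule of thumb, used here purely as an illustrative assumption, is an average cruise speed of about 500 mph plus a fixed 30-minute allowance for taxi, climb, and descent, which lands close to the quoted figure:

```python
CRUISE_MPH = 500   # assumed average speed (not stated by the source)
OVERHEAD_MIN = 30  # assumed allowance for taxi, climb, and descent

def flight_minutes(distance_miles):
    return distance_miles / CRUISE_MPH * 60 + OVERHEAD_MIN

print(round(flight_minutes(433)))  # roughly 82 minutes, vs. the quoted 79
```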

Flight carbon footprint between Thompson Airport (YTH) and Summer Beaver Airport (SUR)

On average, flying from Thompson to Summer Beaver generates about 89 kg (196 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
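The kilogram-to-pound conversion can be checked directly, using the exact definition of the pound:

```python
KG_PER_LB = 0.45359237  # exact definition of the international pound

co2_kg = 89
co2_lb = co2_kg / KG_PER_LB
print(round(co2_lb))  # 196
```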

Map of flight path and driving directions from Thompson to Summer Beaver

See the map of the shortest flight path between Thompson Airport (YTH) and Summer Beaver Airport (SUR).

Airport information

Origin Thompson Airport
City: Thompson
Country: Canada
IATA Code: YTH
ICAO Code: CYTH
Coordinates: 55°48′3″N, 97°51′51″W
Destination Summer Beaver Airport
City: Summer Beaver
Country: Canada
IATA Code: SUR
ICAO Code: CJV7
Coordinates: 52°42′30″N, 88°32′30″W
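The coordinates above are given in degrees/minutes/seconds; the distance formulas need signed decimal degrees. A small helper sketch for the conversion (south and west hemispheres become negative):

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

# Thompson Airport (YTH): 55°48′3″N, 97°51′51″W
print(dms_to_decimal(55, 48, 3, "N"), dms_to_decimal(97, 51, 51, "W"))
# Summer Beaver Airport (SUR): 52°42′30″N, 88°32′30″W
print(dms_to_decimal(52, 42, 30, "N"), dms_to_decimal(88, 32, 30, "W"))
```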