
How far is Churchill from Summer Beaver?

The distance between Summer Beaver (Summer Beaver Airport) and Churchill (Churchill Airport) is 469 miles / 755 kilometers / 408 nautical miles.

The driving distance from Summer Beaver (SUR) to Churchill (YYQ) is 1159 miles / 1866 kilometers, and travel time by car is about 30 hours 42 minutes.

Summer Beaver Airport – Churchill Airport

  • 469 miles
  • 755 kilometers
  • 408 nautical miles


Distance from Summer Beaver to Churchill

There are several ways to calculate the distance from Summer Beaver to Churchill. Here are two standard methods:

Vincenty's formula (applied above)
  • 469.275 miles
  • 755.225 kilometers
  • 407.789 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
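As an illustration, the ellipsoidal calculation can be sketched in Python with a textbook implementation of Vincenty's inverse formula on the WGS-84 ellipsoid. The coordinates are taken from the airport information below; this is a sketch of the standard algorithm, not the site's exact code:

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns statute miles."""
    a = 6378137.0                 # semi-major axis (m)
    f = 1 / 298.257223563         # flattening
    b = a * (1 - f)               # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):          # iterate until lambda converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
        - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
                               * (-3 + 4 * cos_2sigma_m ** 2)))
    meters = b * A * (sigma - delta_sigma)
    return meters / 1609.344      # meters -> statute miles

# SUR: 52°42′30″N, 88°32′30″W   YYQ: 58°44′21″N, 94°3′54″W
miles = vincenty_miles(52 + 42/60 + 30/3600, -(88 + 32/60 + 30/3600),
                       58 + 44/60 + 21/3600, -(94 + 3/60 + 54/3600))
print(round(miles, 3))
```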

Haversine formula
  • 468.470 miles
  • 753.929 kilometers
  • 407.089 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
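The spherical calculation is much shorter. A minimal Python sketch, assuming the commonly used 6371 km mean Earth radius (the site does not state which radius it uses):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given radius, in km."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h))

# SUR (52.70833°N, 88.54167°W) to YYQ (58.73917°N, 94.065°W)
km = haversine_km(52.70833, -88.54167, 58.73917, -94.065)
print(round(km, 1))
```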

How long does it take to fly from Summer Beaver to Churchill?

The estimated flight time from Summer Beaver Airport to Churchill Airport is 1 hour and 23 minutes.
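That estimate is consistent with a common rule of thumb (an assumption here, not necessarily the site's exact formula): a fixed allowance of about 30 minutes for taxi, climb, and descent, plus cruise time at roughly 500 mph.

```python
CRUISE_MPH = 500      # assumed average cruise speed
OVERHEAD_MIN = 30     # assumed fixed taxi/climb/descent allowance

def estimated_flight_minutes(distance_miles):
    """Rough block-time estimate: fixed overhead plus cruise time."""
    return OVERHEAD_MIN + distance_miles / CRUISE_MPH * 60

minutes = estimated_flight_minutes(469)
```

For the 469-mile leg this gives about 86 minutes, close to the quoted 1 hour 23 minutes.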

Flight carbon footprint between Summer Beaver Airport (SUR) and Churchill Airport (YYQ)

On average, flying from Summer Beaver to Churchill generates about 94 kg of CO2 per passenger, which is equivalent to about 207 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
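The unit conversion, and the per-mile emission rate implied by the figures above (an inference from this page's numbers, not a published emission factor), work out as:

```python
KG_PER_LB = 0.45359237   # exact definition of the avoirdupois pound

co2_kg = 94
co2_lb = co2_kg / KG_PER_LB   # kilograms -> pounds, about 207 lbs
kg_per_mile = co2_kg / 469    # implied rate, roughly 0.2 kg CO2 per mile
```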

Map of flight path and driving directions from Summer Beaver to Churchill

See the map of the shortest flight path between Summer Beaver Airport (SUR) and Churchill Airport (YYQ).

Airport information

Origin Summer Beaver Airport
City: Summer Beaver
Country: Canada
IATA Code: SUR
ICAO Code: CJV7
Coordinates: 52°42′30″N, 88°32′30″W
Destination Churchill Airport
City: Churchill
Country: Canada
IATA Code: YYQ
ICAO Code: CYYQ
Coordinates: 58°44′21″N, 94°3′54″W