
How far is Thompson from Arviat?

The distance between Arviat (Arviat Airport) and Thompson (Thompson Airport) is 391 miles / 630 kilometers / 340 nautical miles.

The driving distance from Arviat (YEK) to Thompson (YTH) is 197 miles / 317 kilometers, and travel time by car is about 7 hours 28 minutes.

Arviat Airport – Thompson Airport

  • 391 miles
  • 630 kilometers
  • 340 nautical miles


Distance from Arviat to Thompson

There are several ways to calculate the distance from Arviat to Thompson. Here are two standard methods:

Vincenty's formula (applied above)
  • 391.185 miles
  • 629.552 kilometers
  • 339.931 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
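
Below is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid. The semi-major axis, flattening, and convergence tolerance are common defaults (assumptions, not necessarily the exact constants this calculator uses), and the coordinates are the decimal-degree form of the DMS values in the airport table below, so the result should land close to the ~629.6 km figure above.

```python
import math

def vincenty_distance_m(lat1, lon1, lat2, lon2):
    # WGS-84 ellipsoid parameters (assumed defaults)
    a = 6378137.0            # semi-major axis in metres
    f = 1 / 298.257223563    # flattening
    b = (1 - f) * a          # semi-minor axis in metres

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate until the auxiliary longitude difference converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m + C * cos_sigma *
                                     (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
        B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)  # geodesic length in metres

# YEK -> YTH, decimal degrees converted from the DMS coordinates listed below
print(vincenty_distance_m(61.094167, -94.070556, 55.800833, -97.864167) / 1000)  # ≈ 629.6 km
```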

Haversine formula
  • 390.430 miles
  • 628.335 kilometers
  • 339.274 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
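
For comparison, here is a short sketch of the haversine computation on a sphere. The 6,371 km mean Earth radius is an assumed constant, and the coordinates again come from the airport table below; the result matches the ~628.3 km haversine figure above.

```python
import math

def haversine_distance_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    # Great-circle distance on a sphere with the given mean Earth radius.
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# YEK -> YTH, decimal degrees converted from the DMS coordinates listed below
print(haversine_distance_km(61.094167, -94.070556, 55.800833, -97.864167))  # ≈ 628.3 km
```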

How long does it take to fly from Arviat to Thompson?

The estimated flight time from Arviat Airport to Thompson Airport is 1 hour and 14 minutes.
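
The page does not state how the flight time is derived. A common rule of thumb adds a fixed allowance for taxi, climb, and descent to the cruise time at a typical cruise speed; the sketch below uses assumed values (30 minutes of overhead, 500 mph cruise), so it lands near, but not exactly on, the 1 hour 14 minute figure above.

```python
def estimate_flight_minutes(distance_miles, cruise_mph=500.0, overhead_minutes=30.0):
    # Rule-of-thumb estimate: fixed taxi/climb/descent overhead plus time at cruise speed.
    # Both parameters are assumptions, not the calculator's published formula.
    return overhead_minutes + distance_miles / cruise_mph * 60.0

minutes = estimate_flight_minutes(391)
print(f"{int(minutes // 60)} h {round(minutes % 60)} min")  # ≈ 1 h 17 min with these assumptions
```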

What is the time difference between Arviat and Thompson?

There is no time difference between Arviat and Thompson.

Flight carbon footprint between Arviat Airport (YEK) and Thompson Airport (YTH)

On average, flying from Arviat to Thompson generates about 83 kg of CO2 per passenger, which is roughly 182 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
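
The per-passenger figure works out to roughly 0.21 kg of CO2 per mile flown on this route. The sketch below uses that back-calculated factor (an assumption, not the site's published emissions model) and converts kilograms to pounds.

```python
KG_PER_POUND = 0.45359237

def co2_estimate_kg(distance_miles, kg_per_passenger_mile=0.212):
    # kg_per_passenger_mile is back-calculated from this route (≈ 83 kg / 391 mi),
    # not a published emission factor for the aircraft actually flown.
    return distance_miles * kg_per_passenger_mile

kg = co2_estimate_kg(391)
print(f"{kg:.0f} kg ≈ {kg / KG_PER_POUND:.0f} lbs")  # prints "83 kg ≈ 183 lbs" with this assumed factor
```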

Map of flight path and driving directions from Arviat to Thompson

See the map of the shortest flight path between Arviat Airport (YEK) and Thompson Airport (YTH).

Airport information

Origin Arviat Airport
City: Arviat
Country: Canada
IATA Code: YEK
ICAO Code: CYEK
Coordinates: 61°5′39″N, 94°4′14″W
Destination Thompson Airport
City: Thompson
Country: Canada
IATA Code: YTH
ICAO Code: CYTH
Coordinates: 55°48′3″N, 97°51′51″W
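
The distance sketches above use decimal degrees. A small helper (a hypothetical name, for illustration only) converts the DMS coordinates listed here, shown with the two airports as a check.

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    # Convert degrees/minutes/seconds to signed decimal degrees.
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# Coordinates from the airport table above
print(dms_to_decimal(61, 5, 39, "N"), dms_to_decimal(94, 4, 14, "W"))   # YEK ≈ 61.0942, -94.0706
print(dms_to_decimal(55, 48, 3, "N"), dms_to_decimal(97, 51, 51, "W"))  # YTH ≈ 55.8008, -97.8642
```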