How far is Arviat from Thompson?

The distance between Thompson (Thompson Airport) and Arviat (Arviat Airport) is 391 miles / 630 kilometers / 340 nautical miles.

The driving distance from Thompson (YTH) to Arviat (YEK) is 197 miles / 317 kilometers, and travel time by car is about 7 hours 28 minutes.

Thompson Airport – Arviat Airport

  • 391 miles
  • 630 kilometers
  • 340 nautical miles

Distance from Thompson to Arviat

There are several ways to calculate the distance from Thompson to Arviat. Here are two standard methods:

Vincenty's formula (applied above)
  • 391.185 miles
  • 629.552 kilometers
  • 339.931 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
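
As a reproducibility sketch (not the calculator this page uses), the ellipsoidal figure can be checked with pyproj, whose Geod class implements Karney's geodesic algorithm on the WGS-84 ellipsoid; at this range it agrees with Vincenty's result to well under a metre. The decimal coordinates are converted from the airport information at the end of the page.

```python
# Minimal sketch of an ellipsoidal distance check with pyproj (an assumption:
# this is not the tool behind the page's own figures).
from pyproj import Geod

# Airport coordinates from the "Airport information" section, in decimal degrees (lat, lon).
YTH = (55.8008, -97.8642)   # Thompson Airport
YEK = (61.0942, -94.0706)   # Arviat Airport

geod = Geod(ellps="WGS84")
_, _, meters = geod.inv(YTH[1], YTH[0], YEK[1], YEK[0])  # note: lon, lat argument order

print(f"{meters / 1000:.1f} km")         # ≈ 629.6 km, close to the Vincenty figure above
print(f"{meters / 1609.344:.1f} miles")  # ≈ 391 miles
print(f"{meters / 1852:.1f} nm")         # ≈ 340 nautical miles
```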

Haversine formula
  • 390.430 miles
  • 628.335 kilometers
  • 339.274 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
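
The haversine result can be reproduced in a few lines of Python. This is a minimal sketch assuming the commonly used mean Earth radius of 6,371 km, which is the main source of the small gap between the spherical and ellipsoidal figures.

```python
# Self-contained haversine (great-circle) sketch, assuming a mean Earth radius of 6,371 km.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two latitude/longitude points, in kilometers."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
    return 2 * radius_km * asin(sqrt(a))

km = haversine_km(55.8008, -97.8642, 61.0942, -94.0706)  # YTH -> YEK
print(f"{km:.1f} km, {km / 1.609344:.1f} miles, {km / 1.852:.1f} nm")
# ≈ 628 km / 390 miles / 339 nm, matching the figures above to within coordinate rounding
```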

How long does it take to fly from Thompson to Arviat?

The estimated flight time from Thompson Airport to Arviat Airport is 1 hour and 14 minutes.
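
The page does not state how this estimate is derived. As a hedged back-of-envelope check only, dividing the 630 km distance by an assumed average block speed of about 500 km/h lands in the same neighbourhood:

```python
# Back-of-envelope only; the ~500 km/h average block speed is an assumption,
# not the formula used by this page.
distance_km = 630
assumed_block_speed_kmh = 500
minutes = distance_km / assumed_block_speed_kmh * 60
print(f"~{minutes:.0f} min")  # ~76 min, in line with the 1 hour 14 minutes quoted above
```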

What is the time difference between Thompson and Arviat?

There is no time difference between Thompson and Arviat.

Flight carbon footprint between Thompson Airport (YTH) and Arviat Airport (YEK)

On average, flying from Thompson to Arviat generates about 83 kg of CO2 per passenger (roughly 182 pounds). These figures are estimates and include only the CO2 generated by burning jet fuel.
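
For context, the same numbers can be combined into a rough per-kilometre figure. This is only arithmetic on the values quoted above, not an independent emissions model:

```python
# Arithmetic on the figures quoted above only; not an emissions model.
co2_kg = 83
distance_km = 630
pounds = co2_kg * 2.20462
print(f"{pounds:.0f} lb")                                     # ≈ 183 lb (182 after the page's rounding)
print(f"{co2_kg / distance_km:.2f} kg CO2 per passenger-km")  # ≈ 0.13
```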

Map of flight path and driving directions from Thompson to Arviat

See the map of the shortest flight path between Thompson Airport (YTH) and Arviat Airport (YEK).

Airport information

Origin: Thompson Airport
City: Thompson
Country: Canada
IATA Code: YTH
ICAO Code: CYTH
Coordinates: 55°48′3″N, 97°51′51″W

Destination: Arviat Airport
City: Arviat
Country: Canada
IATA Code: YEK
ICAO Code: CYEK
Coordinates: 61°5′39″N, 94°4′14″W
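
The coordinates above are listed in degrees, minutes, and seconds, while the distance sketches earlier use decimal degrees. A small, self-contained conversion helper (a hypothetical utility, not part of this site):

```python
# Convert the DMS coordinates listed above into the decimal degrees used in the
# distance sketches earlier on this page.
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Return signed decimal degrees; 'S' and 'W' hemispheres are negative."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

# Thompson Airport (YTH): 55°48′3″N, 97°51′51″W
yth = (dms_to_decimal(55, 48, 3, "N"), dms_to_decimal(97, 51, 51, "W"))
# Arviat Airport (YEK): 61°5′39″N, 94°4′14″W
yek = (dms_to_decimal(61, 5, 39, "N"), dms_to_decimal(94, 4, 14, "W"))

print(yth)  # ≈ (55.8008, -97.8642)
print(yek)  # ≈ (61.0942, -94.0706)
```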