
How far is Rigolet from Wabush?

The distance between Wabush (Wabush Airport) and Rigolet (Rigolet Airport) is 357 miles / 574 kilometers / 310 nautical miles.

The driving distance from Wabush (YWK) to Rigolet (YRG) is 439 miles / 707 kilometers, and travel time by car is about 13 hours 13 minutes.

Wabush Airport – Rigolet Airport: 357 miles / 574 kilometers / 310 nautical miles


Distance from Wabush to Rigolet

There are several ways to calculate the distance from Wabush to Rigolet. Here are two standard methods:

Vincenty's formula (applied above)
  • 356.753 miles
  • 574.138 kilometers
  • 310.010 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
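A minimal sketch of Vincenty's inverse method on the WGS-84 ellipsoid, using the airport coordinates from the table below converted to decimal degrees. This simplified version does not handle the rare near-antipodal case where the iteration fails to converge.

```python
from math import atan, atan2, cos, radians, sin, sqrt, tan

def vincenty_km(lat1, lon1, lat2, lon2):
    """Ellipsoidal distance in kilometers via Vincenty's inverse formula."""
    a, f = 6378137.0, 1 / 298.257223563   # WGS-84 semi-major axis (m), flattening
    b = (1 - f) * a                        # semi-minor axis
    L = radians(lon2 - lon1)
    # Reduced latitudes on the auxiliary sphere
    U1 = atan((1 - f) * tan(radians(lat1)))
    U2 = atan((1 - f) * tan(radians(lat2)))
    sinU1, cosU1, sinU2, cosU2 = sin(U1), cos(U1), sin(U2), cos(U2)

    lam = L
    for _ in range(200):                   # iterate until longitude converges
        sinL, cosL = sin(lam), cos(lam)
        sin_sigma = sqrt((cosU2 * sinL) ** 2 +
                         (cosU1 * sinU2 - sinU1 * cosU2 * cosL) ** 2)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cosL
        sigma = atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sinL / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    dsigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2) -
        B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - dsigma) / 1000  # meters -> kilometers

# Wabush (YWK) to Rigolet (YRG)
d = vincenty_km(52.9217, -66.8642, 54.1794, -58.4575)
print(round(d, 1))  # ≈ 574 km
```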

Haversine formula
  • 355.632 miles
  • 572.334 kilometers
  • 309.036 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
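The haversine calculation is compact enough to sketch directly, again using the decimal-degree coordinates of the two airports and a mean Earth radius of 6371 km:

```python
from math import asin, cos, radians, sin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers, assuming a spherical Earth."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

# Wabush (YWK) to Rigolet (YRG)
d = haversine_km(52.9217, -66.8642, 54.1794, -58.4575)
print(round(d, 1))  # ≈ 572 km
```

The spherical result comes out slightly shorter than Vincenty's ellipsoidal figure, which is typical for mid-latitude routes.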

How long does it take to fly from Wabush to Rigolet?

The estimated flight time from Wabush Airport to Rigolet Airport is 1 hour and 10 minutes.
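Estimates like this typically add a fixed allowance for taxi, climb, and descent to the cruise time. The 30-minute overhead and 500 mph cruise speed below are common rule-of-thumb values, not this site's published method:

```python
def estimate_flight_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough block time: cruise time plus a fixed taxi/climb/descent overhead."""
    return overhead_min + distance_miles / cruise_mph * 60

minutes = estimate_flight_minutes(357)
print(f"{int(minutes // 60)} h {round(minutes % 60)} min")  # roughly 1 h 13 min
```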

What is the time difference between Wabush and Rigolet?

There is no time difference between Wabush and Rigolet.

Flight carbon footprint between Wabush Airport (YWK) and Rigolet Airport (YRG)

On average, flying from Wabush to Rigolet generates about 78 kg (roughly 171 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.

Map of flight path and driving directions from Wabush to Rigolet

See the map of the shortest flight path between Wabush Airport (YWK) and Rigolet Airport (YRG).

Airport information

Origin Wabush Airport
City: Wabush
Country: Canada
IATA Code: YWK
ICAO Code: CYWK
Coordinates: 52°55′18″N, 66°51′51″W
Destination Rigolet Airport
City: Rigolet
Country: Canada
IATA Code: YRG
ICAO Code: CCZ2
Coordinates: 54°10′46″N, 58°27′27″W
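The distance formulas above take decimal degrees, while the coordinates here are listed in degrees-minutes-seconds. A small converter (the parser assumes the exact `DD°MM′SS″H` format used in this table):

```python
import re

def dms_to_decimal(dms):
    """Convert a coordinate like '52°55′18″N' to signed decimal degrees."""
    deg, minutes, seconds, hemi = re.match(
        r"(\d+)°(\d+)′(\d+)″([NSEW])", dms).groups()
    value = int(deg) + int(minutes) / 60 + int(seconds) / 3600
    return -value if hemi in "SW" else value  # south/west are negative

print(dms_to_decimal("52°55′18″N"))  # ≈ 52.9217 (Wabush latitude)
print(dms_to_decimal("66°51′51″W"))  # ≈ -66.8642 (Wabush longitude)
```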