
How far is London from Sanikiluaq?

The distance between Sanikiluaq (Sanikiluaq Airport) and London (London International Airport) is 937 miles / 1508 kilometers / 814 nautical miles.

The driving distance from Sanikiluaq (YSK) to London (YXU) is 1112 miles / 1789 kilometers, and travel time by car is about 24 hours 44 minutes.

Sanikiluaq Airport – London International Airport

937 miles / 1508 kilometers / 814 nautical miles


Distance from Sanikiluaq to London

There are several ways to calculate the distance from Sanikiluaq to London. Here are two standard methods:

Vincenty's formula (applied above)
  • 936.957 miles
  • 1507.886 kilometers
  • 814.193 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
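As a rough sketch of how such a figure can be reproduced, the following implements the standard iterative Vincenty inverse formula on the WGS-84 ellipsoid, using the airport coordinates listed below (the site's exact implementation isn't shown, so treat this as an approximation of its method):

```python
import math

def vincenty_distance_km(lat1, lon1, lat2, lon2):
    """Iterative Vincenty inverse formula on the WGS-84 ellipsoid."""
    a = 6378137.0                 # semi-major axis (m)
    f = 1 / 298.257223563         # flattening
    b = (1 - f) * a               # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):          # iterate until longitude difference converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0            # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000   # metres -> km

# YSK (56°32′16″N, 79°14′48″W) and YXU (43°2′8″N, 81°9′14″W)
ysk = (56 + 32 / 60 + 16 / 3600, -(79 + 14 / 60 + 48 / 3600))
yxu = (43 + 2 / 60 + 8 / 3600, -(81 + 9 / 60 + 14 / 3600))
print(round(vincenty_distance_km(*ysk, *yxu), 3))  # ≈ 1507.886 km
```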

Haversine formula
  • 936.691 miles
  • 1507.458 kilometers
  • 813.962 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
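The haversine calculation is much simpler; a minimal version, assuming the conventional mean Earth radius of 6371 km, looks like this:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, r=6371.0):
    """Great-circle distance on a sphere of radius r (km)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(h))

# YSK to YXU, coordinates in decimal degrees
ysk = (56 + 32 / 60 + 16 / 3600, -(79 + 14 / 60 + 48 / 3600))
yxu = (43 + 2 / 60 + 8 / 3600, -(81 + 9 / 60 + 14 / 3600))
print(round(haversine_km(*ysk, *yxu), 1))  # ≈ 1507.5 km
```

The small gap between this result and the Vincenty figure reflects the spherical versus ellipsoidal Earth models.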

How long does it take to fly from Sanikiluaq to London?

The estimated flight time from Sanikiluaq Airport to London International Airport is 2 hours and 16 minutes.
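The article does not state how this estimate is derived. A common rule of thumb is cruise distance at a typical jet speed plus a fixed allowance for taxi, climb, and descent; the speed and overhead below are assumptions, and the result is a ballpark close to, but not identical to, the quoted time:

```python
def estimate_flight_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    # cruise_mph and overhead_min are assumed typical values,
    # not figures stated in the article.
    return overhead_min + distance_miles / cruise_mph * 60

m = estimate_flight_minutes(937)
print(f"{int(m // 60)} h {round(m % 60)} min")  # about 2 h 22 min
```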

What is the time difference between Sanikiluaq and London?

There is no time difference between Sanikiluaq and London.
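This can be checked with IANA time zone data; Sanikiluaq (Nunavut) is conventionally mapped to America/Iqaluit and London (Ontario) to America/Toronto, both Eastern Time (those zone names are my assumption, not stated in the article):

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# Compare UTC offsets of the two cities at the same instant.
when = datetime(2024, 7, 1, 12, 0)
ysk_off = when.replace(tzinfo=ZoneInfo("America/Iqaluit")).utcoffset()
yxu_off = when.replace(tzinfo=ZoneInfo("America/Toronto")).utcoffset()
print(ysk_off == yxu_off)  # True: no time difference
```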

Flight carbon footprint between Sanikiluaq Airport (YSK) and London International Airport (YXU)

On average, flying from Sanikiluaq to London generates about 146 kg (323 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
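Such estimates typically multiply distance by an average per-passenger emission factor. The factor below is back-solved from the article's own figures (146 kg over roughly 1508 km ≈ 0.097 kg per passenger-km) and is illustrative only:

```python
KG_PER_PASSENGER_KM = 0.097   # assumed factor, back-solved from the article
LBS_PER_KG = 2.20462

def co2_per_passenger(distance_km, factor=KG_PER_PASSENGER_KM):
    """Rough per-passenger CO2 for a flight of the given length."""
    kg = distance_km * factor
    return kg, kg * LBS_PER_KG

kg, lbs = co2_per_passenger(1508)
print(round(kg), "kg,", round(lbs), "lbs")
```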

Map of flight path and driving directions from Sanikiluaq to London

See the map of the shortest flight path between Sanikiluaq Airport (YSK) and London International Airport (YXU).

Airport information

Origin Sanikiluaq Airport
City: Sanikiluaq
Country: Canada
IATA Code: YSK
ICAO Code: CYSK
Coordinates: 56°32′16″N, 79°14′48″W
Destination London International Airport
City: London
Country: Canada
IATA Code: YXU
ICAO Code: CYXU
Coordinates: 43°2′8″N, 81°9′14″W
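The coordinates above are in degrees/minutes/seconds; the distance formulas earlier on this page need them in signed decimal degrees. A small conversion helper:

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

# The airport coordinates listed above
ysk = (dms_to_decimal(56, 32, 16, "N"), dms_to_decimal(79, 14, 48, "W"))
yxu = (dms_to_decimal(43, 2, 8, "N"), dms_to_decimal(81, 9, 14, "W"))
print(ysk, yxu)  # (56.537…, -79.246…) (43.035…, -81.153…)
```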