
How far is London from Cape Dorset?

The distance between Cape Dorset (Cape Dorset Airport) and London (London International Airport) is 1477 miles / 2377 kilometers / 1284 nautical miles.

The driving distance from Cape Dorset (YTE) to London (YXU) is 1373 miles / 2209 kilometers, and travel time by car is about 37 hours 32 minutes.

Cape Dorset Airport – London International Airport: 1477 miles / 2377 kilometers / 1284 nautical miles.


Distance from Cape Dorset to London

There are several ways to calculate the distance from Cape Dorset to London. Here are two standard methods:

Vincenty's formula (applied above)
  • 1477.065 miles
  • 2377.106 kilometers
  • 1283.535 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
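For readers who want to reproduce the figure, here is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid. The function name, tolerance, and iteration cap are choices made for this illustration; the site does not publish its exact implementation.

```python
import math

def vincenty_distance_m(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Inverse Vincenty: geodesic distance in metres on the WGS-84 ellipsoid.
    May fail to converge for nearly antipodal points."""
    a = 6378137.0                 # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563         # WGS-84 flattening
    b = (1 - f) * a               # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0            # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)    # equatorial geodesic
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m + C * cos_sigma *
                                     (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a * a - b * b) / (b * b)
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
        B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) *
        (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# YTE (64°13′48″N, 76°31′36″W) to YXU (43°2′8″N, 81°9′14″W)
print(vincenty_distance_m(64.23, -76.526667, 43.035556, -81.153889) / 1000)
# ≈ 2377.1 km, in line with the Vincenty figure above
```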

Haversine formula
  • 1475.694 miles
  • 2374.899 kilometers
  • 1282.343 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
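The haversine calculation is compact enough to show in full. This sketch assumes a mean Earth radius of 6371 km (the exact radius used by the site is not published) and uses the airport coordinates listed at the bottom of this page:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two points on a sphere."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# YTE (64.23, -76.526667) to YXU (43.035556, -81.153889)
print(round(haversine_km(64.23, -76.526667, 43.035556, -81.153889), 1))
# -> 2374.9, matching the haversine figure above
```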

How long does it take to fly from Cape Dorset to London?

The estimated flight time from Cape Dorset Airport to London International Airport is 3 hours and 17 minutes.
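The page does not state how this figure is derived. A common rule of thumb divides the flight distance by an assumed average cruise speed and adds a fixed allowance for taxi, takeoff, and landing; the parameters below are illustrative assumptions, so the result is close to, but not exactly, the figure quoted above:

```python
def estimate_block_time(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough flight-time estimate: cruise leg plus a fixed overhead.
    Both parameters are assumptions, not this site's published model."""
    total_min = distance_miles / cruise_mph * 60 + overhead_min
    return divmod(round(total_min), 60)

hours, minutes = estimate_block_time(1477)
print(f"{hours} h {minutes} min")   # -> 3 h 27 min with these assumed parameters
```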

What is the time difference between Cape Dorset and London?

There is no time difference between Cape Dorset and London; both observe Eastern Time.

Flight carbon footprint between Cape Dorset Airport (YTE) and London International Airport (YXU)

On average, flying from Cape Dorset to London generates about 178 kg of CO2 per passenger, which is equivalent to 392 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
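The unit conversion, and the per-kilometre rate implied by the numbers on this page, can be checked in a couple of lines (the 0.075 kg/km value is simply derived from the figures above, not a published emission factor):

```python
KG_PER_LB = 0.45359237            # exact definition of the pound

co2_kg = 178                      # per-passenger estimate quoted above
print(round(co2_kg / KG_PER_LB))  # -> 392 lbs

# Rate implied by this page's own numbers: 178 kg over 2377 km
print(round(co2_kg / 2377, 3))    # -> 0.075 kg CO2 per passenger-km
```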

Map of flight path and driving directions from Cape Dorset to London

See the map of the shortest flight path between Cape Dorset Airport (YTE) and London International Airport (YXU).

Airport information

Origin Cape Dorset Airport
City: Cape Dorset
Country: Canada
IATA Code: YTE
ICAO Code: CYTE
Coordinates: 64°13′48″N, 76°31′36″W
Destination London International Airport
City: London
Country: Canada
IATA Code: YXU
ICAO Code: CYXU
Coordinates: 43°2′8″N, 81°9′14″W
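The coordinates above are given in degrees-minutes-seconds; to feed them into either distance formula they must first be converted to signed decimal degrees. A small sketch (the function name and regex are this illustration's own, not part of the site):

```python
import re

def dms_to_decimal(dms: str) -> float:
    """Convert a coordinate such as 64°13′48″N to signed decimal degrees."""
    deg, minutes, seconds, hemi = re.match(
        r"(\d+)°(\d+)′(\d+)″([NSEW])", dms).groups()
    value = int(deg) + int(minutes) / 60 + int(seconds) / 3600
    return -value if hemi in "SW" else value

print(dms_to_decimal("64°13′48″N"))   # -> 64.23 (YTE latitude)
print(dms_to_decimal("81°9′14″W"))    # -> ≈ -81.1539 (YXU longitude)
```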