
How far is London from Tuktoyaktuk?

The distance between Tuktoyaktuk (Tuktoyaktuk/James Gruben Airport) and London (London International Airport) is 2570 miles / 4136 kilometers / 2233 nautical miles.

The driving distance from Tuktoyaktuk (YUB) to London (YXU) is 4067 miles / 6545 kilometers, and travel time by car is about 91 hours 17 minutes.

Tuktoyaktuk/James Gruben Airport – London International Airport

2570 miles / 4136 kilometers / 2233 nautical miles


Distance from Tuktoyaktuk to London

There are several ways to calculate the distance from Tuktoyaktuk to London. Here are two standard methods:

Vincenty's formula (applied above)
  • 2569.863 miles
  • 4135.793 kilometers
  • 2233.150 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
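A minimal Python sketch of the standard published Vincenty inverse iteration on the WGS-84 ellipsoid (this is the textbook formulation, not necessarily the exact implementation used for the figures above; the decimal coordinates are converted from the DMS values listed below):

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns kilometers."""
    a = 6378137.0                  # semi-major axis (m)
    f = 1 / 298.257223563          # flattening
    b = (1 - f) * a                # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    # Reduced latitudes on the auxiliary sphere
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0             # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - d_sigma) / 1000.0

# YUB (69°25′59″N, 133°1′33″W) to YXU (43°2′8″N, 81°9′14″W)
print(round(vincenty_km(69.43306, -133.02583, 43.03556, -81.15389), 1))
```

Run against the airport coordinates above, this reproduces the roughly 4136 km figure quoted for Vincenty's formula.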

Haversine formula
  • 2563.964 miles
  • 4126.300 kilometers
  • 2228.024 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
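The haversine calculation fits in a few lines. A sketch assuming a mean Earth radius of 6371 km (the radius choice is an assumption; different sources use slightly different values):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given mean radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.atan2(math.sqrt(a), math.sqrt(1 - a))

# YUB to YXU in decimal degrees (west longitudes negative)
km = haversine_km(69.43306, -133.02583, 43.03556, -81.15389)
print(round(km, 1), round(km / 1.609344, 1), round(km / 1.852, 1))
```

This yields roughly 4126 km, i.e. about 2564 statute miles and 2228 nautical miles, matching the haversine figures above; the small gap versus Vincenty reflects the spherical approximation.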

How long does it take to fly from Tuktoyaktuk to London?

The estimated flight time from Tuktoyaktuk/James Gruben Airport to London International Airport is 5 hours and 21 minutes.
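Flight-time estimates of this kind are typically an average cruise speed plus a fixed allowance for taxi, climb, and descent. A sketch with assumed parameters (500 mph cruise, 30-minute buffer); the site's exact model is unpublished, so the output is in the same ballpark as, but not identical to, the quoted 5 hours 21 minutes:

```python
def flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough block-time estimate: cruise time plus a fixed taxi/climb/descent
    buffer. The 500 mph and 30 min figures are illustrative assumptions."""
    total_min = overhead_min + distance_miles / cruise_mph * 60
    h, m = divmod(round(total_min), 60)
    return f"{h} h {m:02d} min"

print(flight_time(2570))  # 5 h 38 min with these assumptions
```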

Flight carbon footprint between Tuktoyaktuk/James Gruben Airport (YUB) and London International Airport (YXU)

On average, flying from Tuktoyaktuk to London generates about 283 kg of CO2 per passenger; 283 kilograms equals about 624 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
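The unit conversion and per-mile intensity follow directly from the figures above (the per-passenger-mile number is derived here for illustration, not stated by the source):

```python
KG_PER_LB = 0.45359237  # exact definition of the avoirdupois pound

co2_kg = 283.0            # per-passenger estimate from the text
distance_miles = 2570.0   # route distance from the text

co2_lbs = co2_kg / KG_PER_LB        # kilograms -> pounds
per_mile = co2_kg / distance_miles  # kg CO2 per passenger-mile

print(round(co2_lbs))      # 624
print(round(per_mile, 3))  # 0.11
```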

Map of flight path and driving directions from Tuktoyaktuk to London

See the map of the shortest flight path between Tuktoyaktuk/James Gruben Airport (YUB) and London International Airport (YXU).

Airport information

Origin Tuktoyaktuk/James Gruben Airport
City: Tuktoyaktuk
Country: Canada
IATA Code: YUB
ICAO Code: CYUB
Coordinates: 69°25′59″N, 133°1′33″W
Destination London International Airport
City: London
Country: Canada
IATA Code: YXU
ICAO Code: CYXU
Coordinates: 43°2′8″N, 81°9′14″W
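The degree-minute-second coordinates above convert to the decimal degrees used by the distance formulas. A small parser, assuming the exact format shown in this listing (integer seconds, prime/double-prime marks):

```python
import re

def dms_to_decimal(dms: str) -> float:
    """Parse a coordinate like 69°25′59″N into signed decimal degrees."""
    m = re.fullmatch(r"(\d+)°(\d+)′(\d+)″([NSEW])", dms)
    if not m:
        raise ValueError(f"unrecognized coordinate: {dms!r}")
    deg, minutes, seconds, hemi = m.groups()
    value = int(deg) + int(minutes) / 60 + int(seconds) / 3600
    return -value if hemi in "SW" else value  # south/west are negative

print(round(dms_to_decimal("69°25′59″N"), 5))  # 69.43306
print(round(dms_to_decimal("133°1′33″W"), 5))  # -133.02583
```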