How far is London from Nakina?

The distance between Nakina (Nakina Airport) and London (London International Airport) is 559 miles / 900 kilometers / 486 nautical miles.

The driving distance from Nakina (YQN) to London (YXU) is 859 miles / 1383 kilometers, and travel time by car is about 18 hours 38 minutes.

Nakina Airport – London International Airport

559 miles / 900 kilometers / 486 nautical miles

Distance from Nakina to London

There are several ways to calculate the distance from Nakina to London. Here are two standard methods:

Vincenty's formula (applied above)
  • 559.383 miles
  • 900.240 kilometers
  • 486.091 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
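
As a quick cross-check, the ellipsoidal figure can be reproduced with an off-the-shelf geodesic solver. The sketch below uses the third-party geopy package (an assumption for illustration; the site does not say which library it uses), whose geodesic() routine works on the WGS-84 ellipsoid and agrees with Vincenty-style results to well under a metre at this range.

```python
# Minimal sketch, assuming the third-party geopy package is installed
# (pip install geopy). geodesic() solves the inverse geodesic problem
# on the WGS-84 ellipsoid, comparable to Vincenty's formula.
from geopy.distance import geodesic

yqn = (50.182778, -86.696389)   # Nakina Airport (YQN), decimal degrees
yxu = (43.035556, -81.153889)   # London International Airport (YXU)

d = geodesic(yqn, yxu)
print(f"{d.miles:.3f} miles / {d.km:.3f} km / {d.nm:.3f} NM")
# Should print values close to the 559.383 mi / 900.240 km quoted above.
```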

Haversine formula
  • 559.153 miles
  • 899.869 kilometers
  • 485.890 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
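
The spherical figure can likewise be reproduced in a few lines of standard-library Python. This is a generic haversine sketch using the DMS coordinates listed under "Airport information" below and an assumed mean Earth radius of 6371 km (the site's exact radius is not stated), so the result may differ from the table above by a fraction of a kilometre.

```python
import math

def dms(degrees, minutes, seconds, west_or_south=False):
    """Convert degrees/minutes/seconds to signed decimal degrees."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if west_or_south else value

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return radius_km * 2 * math.asin(math.sqrt(a))

# Coordinates from the "Airport information" section below.
yqn_lat, yqn_lon = dms(50, 10, 58), dms(86, 41, 47, west_or_south=True)  # YQN
yxu_lat, yxu_lon = dms(43, 2, 8), dms(81, 9, 14, west_or_south=True)     # YXU

km = haversine_km(yqn_lat, yqn_lon, yxu_lat, yxu_lon)
print(f"{km:.1f} km / {km * 0.621371:.1f} mi / {km / 1.852:.1f} NM")
# Prints roughly 900 km / 559 mi / 486 NM, in line with the figures above.
```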

How long does it take to fly from Nakina to London?

The estimated flight time from Nakina Airport to London International Airport is 1 hour and 33 minutes.
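
The calculator does not publish the formula behind this estimate. A common rule of thumb for sites like this is a fixed allowance of about 30 minutes for take-off and landing plus cruise at roughly 500 mph; the sketch below uses those assumed numbers and lands within a few minutes of the quoted 1 hour 33 minutes.

```python
# Rough sketch only: the 500 mph cruise speed and 30-minute overhead
# are assumptions, not the calculator's published method.
distance_miles = 559.383   # great-circle distance from above
cruise_mph = 500           # assumed average cruise speed
overhead_min = 30          # assumed take-off/landing allowance

total_min = overhead_min + distance_miles / cruise_mph * 60
print(f"~{int(total_min // 60)} h {int(total_min % 60)} min")  # ~1 h 37 min
```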

What is the time difference between Nakina and London?

There is no time difference between Nakina and London.

Flight carbon footprint between Nakina Airport (YQN) and London International Airport (YXU)

On average, flying from Nakina to London generates about 107 kg of CO2 per passenger, which is equivalent to roughly 237 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
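
For context, the quoted figure is consistent with a short-haul emission factor of roughly 0.19 kg of CO2 per passenger-mile. That factor is an assumption for illustration only, not the calculator's published methodology; the pound conversion simply uses 1 kg ≈ 2.205 lb.

```python
# Illustrative only: the 0.19 kg per passenger-mile factor is an assumed
# short-haul average, not the calculator's published figure.
distance_miles = 559.383
factor_kg_per_mile = 0.19

co2_kg = distance_miles * factor_kg_per_mile
print(f"{co2_kg:.0f} kg CO2 (~{co2_kg * 2.20462:.0f} lbs)")
# Prints about 106 kg (~234 lbs), close to the ~107 kg / 237 lbs quoted above.
```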

Map of flight path and driving directions from Nakina to London

See the map of the shortest flight path between Nakina Airport (YQN) and London International Airport (YXU).

Airport information

Origin: Nakina Airport
City: Nakina
Country: Canada
IATA Code: YQN
ICAO Code: CYQN
Coordinates: 50°10′58″N, 86°41′47″W
Destination: London International Airport
City: London
Country: Canada
IATA Code: YXU
ICAO Code: CYXU
Coordinates: 43°2′8″N, 81°9′14″W