
How far is London from Wemindji?

The distance between Wemindji (Wemindji Airport) and London (London International Airport) is 697 miles / 1122 kilometers / 606 nautical miles.

The driving distance from Wemindji (YNC) to London (YXU) is 1051 miles / 1692 kilometers, and travel time by car is about 24 hours 4 minutes.

Wemindji Airport – London International Airport

  • 697 miles
  • 1122 kilometers
  • 606 nautical miles


Distance from Wemindji to London

There are several ways to calculate the distance from Wemindji to London. Here are two standard methods:

Vincenty's formula (applied above)
  • 697.437 miles
  • 1122.416 kilometers
  • 606.056 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
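For readers who want to reproduce the figure, the sketch below is a straightforward Python transcription of Vincenty's inverse formula on the WGS-84 ellipsoid. The vincenty_miles helper name and the decimal-degree coordinates (converted from the airport positions listed under Airport information) are illustrative, not the calculator's own code.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2):
    """Distance in statute miles via Vincenty's inverse formula on the WGS-84 ellipsoid."""
    a = 6378137.0                 # semi-major axis (metres)
    f = 1 / 298.257223563         # flattening
    b = (1 - f) * a               # semi-minor axis (metres)

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):          # iterate until the auxiliary longitude converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0            # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2) -
        B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    metres = b * A * (sigma - delta_sigma)
    return metres / 1609.344      # metres -> statute miles

# YNC (53°0′38″N, 78°49′51″W) and YXU (43°2′8″N, 81°9′14″W) in decimal degrees
print(vincenty_miles(53.010556, -78.830833, 43.035556, -81.153889))  # ≈ 697 miles
```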

Haversine formula
  • 697.417 miles
  • 1122.384 kilometers
  • 606.039 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
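A minimal haversine sketch, assuming a mean Earth radius of 6371 km and the same illustrative decimal-degree coordinates as above:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in kilometres, assuming a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

km = haversine_km(53.010556, -78.830833, 43.035556, -81.153889)
print(km, km / 1.609344, km / 1.852)   # ≈ 1122 km, ≈ 697 miles, ≈ 606 nautical miles
```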

How long does it take to fly from Wemindji to London?

The estimated flight time from Wemindji Airport to London International Airport is 1 hour and 49 minutes.
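Such estimates typically combine a fixed allowance for taxi, climb, and descent with cruise time over the great-circle distance. The constants below (30 minutes of overhead, 500 mph cruise) are assumptions for illustration, not the calculator's published inputs, so the result only approximates the quoted 1 hour 49 minutes.

```python
def flight_time_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough block time: fixed taxi/climb/descent allowance plus cruise at a typical jet speed."""
    return overhead_min + distance_miles / cruise_mph * 60

print(flight_time_minutes(697.4))  # ≈ 114 minutes, in the same ballpark as the quoted estimate
```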

What is the time difference between Wemindji and London?

There is no time difference between Wemindji and London; both observe Eastern Time.

Flight carbon footprint between Wemindji Airport (YNC) and London International Airport (YXU)

On average, flying from Wemindji to London generates about 124 kg of CO2 per passenger (roughly 274 pounds). These figures are estimates and include only the CO2 generated by burning jet fuel.
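As a rough check on the arithmetic, the sketch below applies an assumed short-haul emission factor of about 0.11 kg of CO2 per passenger-kilometre (an illustrative figure, not the calculator's published methodology) and converts kilograms to pounds:

```python
def flight_co2_kg(distance_km, kg_per_pax_km=0.11):
    """Estimated per-passenger CO2 from jet-fuel burn, using an assumed emission factor."""
    return distance_km * kg_per_pax_km

co2_kg = flight_co2_kg(1122.4)
print(round(co2_kg))            # ≈ 123 kg, in line with the quoted 124 kg
print(124 * 2.20462)            # ≈ 273.4 lbs, which the page rounds up to 274
```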

Map of flight path and driving directions from Wemindji to London


Airport information

Origin: Wemindji Airport
City: Wemindji
Country: Canada
IATA Code: YNC
ICAO Code: CYNC
Coordinates: 53°0′38″N, 78°49′51″W
Destination: London International Airport
City: London
Country: Canada
IATA Code: YXU
ICAO Code: CYXU
Coordinates: 43°2′8″N, 81°9′14″W
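The coordinates above are given in degrees, minutes, and seconds, while the distance formulas earlier expect decimal degrees. A small conversion sketch (the dms_to_decimal helper is illustrative):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to signed decimal degrees."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

# Wemindji Airport (YNC): 53°0′38″N, 78°49′51″W
print(dms_to_decimal(53, 0, 38, "N"), dms_to_decimal(78, 49, 51, "W"))  # ≈ 53.0106, -78.8308
# London International Airport (YXU): 43°2′8″N, 81°9′14″W
print(dms_to_decimal(43, 2, 8, "N"), dms_to_decimal(81, 9, 14, "W"))    # ≈ 43.0356, -81.1539
```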