
How far is Inukjuak from Nakina?

The distance between Nakina (Nakina Airport) and Inukjuak (Inukjuak Airport) is 670 miles / 1078 kilometers / 582 nautical miles.

The driving distance from Nakina (YQN) to Inukjuak (YPH) is 1005 miles / 1618 kilometers, and travel time by car is about 23 hours 54 minutes.

Nakina Airport – Inukjuak Airport

  • 670 miles
  • 1078 kilometers
  • 582 nautical miles


Distance from Nakina to Inukjuak

There are several ways to calculate the distance from Nakina to Inukjuak. Here are two standard methods:

Vincenty's formula (applied above)
  • 669.712 miles
  • 1077.798 kilometers
  • 581.964 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
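For readers who want to reproduce the figure, below is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid. The function name `vincenty_distance`, the convergence tolerance, and the iteration cap are illustrative choices, not taken from this page.

```python
import math

# WGS-84 ellipsoid parameters
A = 6378137.0            # semi-major axis (metres)
F = 1 / 298.257223563    # flattening
B = (1 - F) * A          # semi-minor axis (metres)

def vincenty_distance(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Geodesic distance in metres between two lat/lon points (Vincenty inverse).

    May fail to converge for nearly antipodal points; these airports are not.
    """
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)
    # Reduced latitudes on the auxiliary sphere
    U1 = math.atan((1 - F) * math.tan(phi1))
    U2 = math.atan((1 - F) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sinLam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sinLam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # cos(2*sigma_m); zero along the equator, where cos2_alpha vanishes
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)
        C = F / 16 * cos2_alpha * (4 + F * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * F * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (A ** 2 - B ** 2) / B ** 2
    A_ = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B_ = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B_ * sin_sigma * (
        cos_2sigma_m + B_ / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B_ / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
              * (-3 + 4 * cos_2sigma_m ** 2)))
    return B * A_ * (sigma - delta_sigma)

# Nakina (YQN) and Inukjuak (YPH), coordinates from the airport table below
d_m = vincenty_distance(50.182778, -86.696389, 58.471667, -78.076667)
print(f"{d_m / 1000:.1f} km / {d_m / 1609.344:.1f} mi")  # ≈ 1078 km / 670 mi
```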

Haversine formula
  • 668.611 miles
  • 1076.024 kilometers
  • 581.007 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
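The haversine computation is short enough to show in full. Here is a sketch assuming a mean Earth radius of 6,371 km (the exact radius this page uses is not stated), with the airport coordinates taken from the table below.

```python
import math

EARTH_RADIUS_KM = 6371.0  # mean Earth radius for the spherical model

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two lat/lon points."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

# Nakina (YQN) to Inukjuak (YPH)
print(f"{haversine_km(50.182778, -86.696389, 58.471667, -78.076667):.1f} km")
# ≈ 1076 km, in line with the 1076.024 km figure above
```

The two methods differ by about 1.8 km here: the haversine result depends on the chosen sphere radius, while Vincenty's ellipsoidal model tracks the Earth's actual shape more closely.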

How long does it take to fly from Nakina to Inukjuak?

The estimated flight time from Nakina Airport to Inukjuak Airport is 1 hour and 46 minutes.
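The page does not state how this estimate is derived. A common rule of thumb, shown below purely as an assumption, adds roughly 30 minutes of taxi, climb, and descent overhead to cruise time at about 500 mph, which lands close to the quoted figure.

```python
# Hypothetical rule-of-thumb estimate (not the page's stated formula):
# ~30 min of ground/climb/descent overhead plus cruise at ~500 mph.
distance_mi = 670
cruise_mph = 500
overhead_min = 30
total_min = overhead_min + distance_mi / cruise_mph * 60
print(f"{total_min // 60:.0f} h {total_min % 60:.0f} min")  # 1 h 50 min, near the quoted 1 h 46 min
```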

What is the time difference between Nakina and Inukjuak?

There is no time difference between Nakina and Inukjuak.

Flight carbon footprint between Nakina Airport (YQN) and Inukjuak Airport (YPH)

On average, flying from Nakina to Inukjuak generates about 121 kg of CO2 per passenger, which is equivalent to 267 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
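The unit conversion can be checked directly, using the standard factor of about 2.20462 lb per kg:

```python
co2_kg = 121
co2_lbs = co2_kg * 2.20462  # kilograms to pounds
print(round(co2_lbs))  # 267
```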

Map of flight path and driving directions from Nakina to Inukjuak

See the map of the shortest flight path between Nakina Airport (YQN) and Inukjuak Airport (YPH).

Airport information

Origin Nakina Airport
City: Nakina
Country: Canada
IATA Code: YQN
ICAO Code: CYQN
Coordinates: 50°10′58″N, 86°41′47″W
Destination Inukjuak Airport
City: Inukjuak
Country: Canada
IATA Code: YPH
ICAO Code: CYPH
Coordinates: 58°28′18″N, 78°4′36″W