
How far is Toronto from Inukjuak?

The distance between Inukjuak (Inukjuak Airport) and Toronto (Toronto Pearson International Airport) is 1025 miles / 1649 kilometers / 891 nautical miles.

The driving distance from Inukjuak (YPH) to Toronto (YYZ) is 1006 miles / 1619 kilometers, and travel time by car is about 22 hours 30 minutes.

Inukjuak Airport – Toronto Pearson International Airport

1025 miles / 1649 kilometers / 891 nautical miles


Distance from Inukjuak to Toronto

There are several ways to calculate the distance from Inukjuak to Toronto. Here are two standard methods:

Vincenty's formula (applied above)
  • 1024.873 miles
  • 1649.373 kilometers
  • 890.590 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
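
For readers who want to reproduce the figure above, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. It is illustrative only, not the calculator's own code, and it omits the special handling that near-antipodal point pairs need to guarantee convergence:

    import math

    def vincenty_distance_m(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
        # Vincenty's inverse formula on the WGS-84 ellipsoid; returns metres.
        a = 6378137.0              # WGS-84 semi-major axis (m)
        f = 1 / 298.257223563      # WGS-84 flattening
        b = (1 - f) * a            # semi-minor axis (m)

        L = math.radians(lon2 - lon1)
        U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))   # reduced latitudes
        U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)

        lam = L
        for _ in range(max_iter):
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.hypot(cosU2 * sin_lam,
                                   cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
            if sin_sigma == 0:
                return 0.0          # coincident points
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos2_alpha = 1 - sin_alpha ** 2
            if cos2_alpha != 0:
                cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
            else:
                cos_2sigma_m = 0.0  # both points on the equator
            C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
            lam_prev = lam
            lam = L + (1 - C) * f * sin_alpha * (
                sigma + C * sin_sigma * (
                    cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
            if abs(lam - lam_prev) < tol:
                break

        u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        d_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
            B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sigma_m ** 2)))
        return b * A * (sigma - d_sigma)

    # YPH (58°28′18″N, 78°4′36″W) to YYZ (43°40′37″N, 79°37′50″W)
    metres = vincenty_distance_m(58.4717, -78.0767, 43.6769, -79.6306)
    print(round(metres / 1609.344, 1), "mi")   # ~1024.9 statute miles
    print(round(metres / 1852, 1), "NM")       # ~890.6 nautical miles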

Haversine formula
  • 1024.370 miles
  • 1648.563 kilometers
  • 890.153 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth, giving the great-circle distance, i.e. the shortest path between the two points along the sphere's surface.
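
A minimal haversine implementation in Python, using the airport coordinates listed further down converted to decimal degrees; the 6371 km mean Earth radius is a common convention and an assumption here, so the last decimal places may differ slightly from the figure above:

    import math

    def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
        # Great-circle distance on a sphere with the given mean Earth radius.
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        a = (math.sin(dphi / 2) ** 2 +
             math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
        return 2 * radius_km * math.asin(math.sqrt(a))

    # YPH (58°28′18″N, 78°4′36″W) to YYZ (43°40′37″N, 79°37′50″W)
    print(haversine_km(58.4717, -78.0767, 43.6769, -79.6306))  # ~1648.6, close to the
                                                               # 1648.563 km quoted above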

How long does it take to fly from Inukjuak to Toronto?

The estimated flight time from Inukjuak Airport to Toronto Pearson International Airport is 2 hours and 26 minutes.
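
The calculator does not state how it derives this estimate. A common rule of thumb is to cruise the great-circle distance at an average speed of roughly 500 mph and add a fixed allowance for taxi, climb and descent; the sketch below uses assumed values for both, so it only lands in the same range as the quoted time.

    def estimated_block_time_min(distance_miles, avg_speed_mph=500, overhead_min=30):
        # Rule-of-thumb block time: average cruise speed plus a fixed overhead
        # for taxi, climb and descent (both values are assumptions).
        return overhead_min + distance_miles / avg_speed_mph * 60

    minutes = estimated_block_time_min(1025)
    print(f"{minutes // 60:.0f} h {minutes % 60:.0f} min")  # ~2 h 33 min, in the same
                                                            # range as the 2 h 26 min above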

What is the time difference between Inukjuak and Toronto?

There is no time difference between Inukjuak and Toronto; both observe Eastern Time.

Flight carbon footprint between Inukjuak Airport (YPH) and Toronto Pearson International Airport (YYZ)

On average, flying from Inukjuak to Toronto generates about 152 kg (336 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
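
Two quick arithmetic checks on those numbers, using only the figures quoted above and the exact kilogram-to-pound factor:

    KG_PER_LB = 0.45359237              # exact definition of the avoirdupois pound

    co2_kg = 152                        # per-passenger estimate quoted above
    print(round(co2_kg / KG_PER_LB))    # 335 lbs; the page's 336 suggests the underlying
                                        # figure is a little above 152 kg before rounding

    print(round(co2_kg / 1649, 3))      # ~0.092 kg of CO2 per passenger-kilometre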

Map of flight path and driving directions from Inukjuak to Toronto

See the map of the shortest flight path between Inukjuak Airport (YPH) and Toronto Pearson International Airport (YYZ).

Airport information

Origin Inukjuak Airport
City: Inukjuak
Country: Canada
IATA Code: YPH
ICAO Code: CYPH
Coordinates: 58°28′18″N, 78°4′36″W
Destination Toronto Pearson International Airport
City: Toronto
Country: Canada
IATA Code: YYZ
ICAO Code: CYYZ
Coordinates: 43°40′37″N, 79°37′50″W
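
The coordinates above are given in degrees, minutes and seconds, while the distance formulas earlier on this page expect decimal degrees. A small conversion helper (the helper name and the negative sign for south/west are choices of this sketch):

    def dms_to_decimal(deg, minutes, seconds, hemisphere):
        # Convert degrees/minutes/seconds to signed decimal degrees.
        value = deg + minutes / 60 + seconds / 3600
        return -value if hemisphere in ("S", "W") else value

    yph = (dms_to_decimal(58, 28, 18, "N"), dms_to_decimal(78, 4, 36, "W"))
    yyz = (dms_to_decimal(43, 40, 37, "N"), dms_to_decimal(79, 37, 50, "W"))
    print(yph, yyz)   # ≈ (58.4717, -78.0767) and (43.6769, -79.6306),
                      # ready for the distance formulas above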