
How far is Iqaluit from Umiujaq?

The distance between Umiujaq (Umiujaq Airport) and Iqaluit (Iqaluit Airport) is 569 miles / 916 kilometers / 495 nautical miles.

The driving distance from Umiujaq (YUD) to Iqaluit (YFB) is 2309 miles / 3716 kilometers, and travel time by car is about 79 hours 21 minutes.

Umiujaq Airport – Iqaluit Airport


Distance from Umiujaq to Iqaluit

There are several ways to calculate the distance from Umiujaq to Iqaluit. Here are two standard methods:

Vincenty's formula (applied above)
  • 569.476 miles
  • 916.482 kilometers
  • 494.861 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
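As a concrete sketch, the iterative inverse Vincenty method can be written in Python using the standard WGS-84 ellipsoid constants; the coordinates are the YUD and YFB values listed in the airport information below (the convergence tolerance and iteration cap are implementation choices, not part of the formula itself):

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Geodesic distance in km on the WGS-84 ellipsoid (Vincenty inverse)."""
    a = 6378137.0            # semi-major axis (m)
    f = 1 / 298.257223563    # flattening
    b = (1 - f) * a          # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate lambda until it converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000  # metres -> km

# YUD 56°32′9″N, 76°31′5″W  ->  YFB 63°45′23″N, 68°33′20″W
yud = (56 + 32/60 + 9/3600, -(76 + 31/60 + 5/3600))
yfb = (63 + 45/60 + 23/3600, -(68 + 33/60 + 20/3600))
print(round(vincenty_distance(*yud, *yfb), 1))  # close to the 916.482 km above
```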

Haversine formula
  • 568.139 miles
  • 914.332 kilometers
  • 493.700 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
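The haversine formula is much shorter; a minimal Python version, assuming a spherical earth of mean radius 6371 km and using the airport coordinates listed below in decimal degrees:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, r=6371.0):
    """Great-circle distance in km between two lat/lon points,
    assuming a spherical earth of mean radius r."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# YUD (56°32′9″N, 76°31′5″W) to YFB (63°45′23″N, 68°33′20″W)
print(round(haversine_km(56.5358, -76.5181, 63.7564, -68.5556), 1))
```

The result lands close to the 914.332 km haversine figure above; the small gap from the Vincenty result reflects the spherical-earth simplification.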

How long does it take to fly from Umiujaq to Iqaluit?

The estimated flight time from Umiujaq Airport to Iqaluit Airport is 1 hour and 34 minutes.
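A common rule of thumb for such estimates is cruise time at a fixed average speed plus a fixed overhead for taxi, climb, and descent. The 500 mph cruise speed and 30-minute overhead below are assumptions for illustration, not the site's actual model, but they land near the quoted figure:

```python
def estimated_flight_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    # Rule-of-thumb estimate: cruise time plus fixed taxi/climb/descent
    # overhead. cruise_mph and overhead_min are assumed values.
    return overhead_min + distance_miles / cruise_mph * 60

print(round(estimated_flight_minutes(569)))  # 98 min, i.e. about 1 h 38 min
```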

What is the time difference between Umiujaq and Iqaluit?

There is no time difference between Umiujaq and Iqaluit.

Flight carbon footprint between Umiujaq Airport (YUD) and Iqaluit Airport (YFB)

On average, flying from Umiujaq to Iqaluit generates about 109 kg (240 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
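The kilogram-to-pound conversion above is a straight multiplication by the standard factor:

```python
KG_TO_LB = 2.20462  # pounds per kilogram

co2_kg = 109          # estimated CO2 per passenger for this flight
co2_lb = co2_kg * KG_TO_LB
print(round(co2_lb))  # 240
```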

Map of flight path and driving directions from Umiujaq to Iqaluit

See the map of the shortest flight path between Umiujaq Airport (YUD) and Iqaluit Airport (YFB).

Airport information

Origin: Umiujaq Airport
City: Umiujaq
Country: Canada
IATA Code: YUD
ICAO Code: CYMU
Coordinates: 56°32′9″N, 76°31′5″W
Destination: Iqaluit Airport
City: Iqaluit
Country: Canada
IATA Code: YFB
ICAO Code: CYFB
Coordinates: 63°45′23″N, 68°33′20″W
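The coordinates above are given in degrees, minutes, and seconds; a distance routine like haversine or Vincenty expects signed decimal degrees. A small sketch of that conversion (the regex assumes the exact `56°32′9″N` notation used on this page):

```python
import re

def dms_to_decimal(dms: str) -> float:
    """Convert a '56°32′9″N'-style coordinate to signed decimal degrees
    (negative for south latitudes and west longitudes)."""
    m = re.fullmatch(r"(\d+)°(\d+)′(\d+)″([NSEW])", dms)
    deg, minutes, seconds, hemi = m.groups()
    value = int(deg) + int(minutes) / 60 + int(seconds) / 3600
    return -value if hemi in "SW" else value

print(round(dms_to_decimal("56°32′9″N"), 4))   # 56.5358
print(round(dms_to_decimal("76°31′5″W"), 4))   # -76.5181
```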