How far is Prince Rupert from Umiujaq?

The distance between Umiujaq (Umiujaq Airport) and Prince Rupert (Prince Rupert Airport) is 2073 miles / 3336 kilometers / 1801 nautical miles.

The driving distance from Umiujaq (YUD) to Prince Rupert (YPR) is 3330 miles / 5359 kilometers, and travel time by car is about 72 hours 50 minutes.

Distance from Umiujaq to Prince Rupert

There are several ways to calculate the distance from Umiujaq to Prince Rupert. Here are two standard methods:

Vincenty's formula (applied above)
  • 2073.029 miles
  • 3336.217 kilometers
  • 1801.413 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
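
For readers who want to reproduce the figure above, here is a minimal self-contained Python sketch of Vincenty's inverse method. The WGS-84 ellipsoid constants are an assumption (the calculator does not state which ellipsoid it uses), and the airport coordinates are the decimal-degree equivalents of the DMS values listed under Airport information below.

    from math import atan, atan2, cos, radians, sin, sqrt, tan

    def vincenty_km(lat1, lon1, lat2, lon2):
        # Vincenty inverse formula on the WGS-84 ellipsoid; returns kilometers.
        a = 6378137.0                      # semi-major axis (meters)
        f = 1 / 298.257223563              # flattening
        b = (1 - f) * a                    # semi-minor axis
        L = radians(lon2 - lon1)
        U1 = atan((1 - f) * tan(radians(lat1)))   # reduced latitudes
        U2 = atan((1 - f) * tan(radians(lat2)))
        sinU1, cosU1, sinU2, cosU2 = sin(U1), cos(U1), sin(U2), cos(U2)

        lam = L
        for _ in range(200):               # iterate lambda to convergence
            sin_lam, cos_lam = sin(lam), cos(lam)
            sin_sigma = sqrt((cosU2 * sin_lam) ** 2
                             + (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
            if sin_sigma == 0:
                return 0.0                 # coincident points
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos2_alpha = 1 - sin_alpha ** 2
            # cos(2 * sigma_m); defined as zero for points along the equator
            cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
            C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
            lam_prev = lam
            lam = L + (1 - C) * f * sin_alpha * (
                sigma + C * sin_sigma * (cos_2sm
                + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
            if abs(lam - lam_prev) < 1e-12:
                break                      # may not converge for near-antipodal points

        u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sm ** 2)
            - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
              * (-3 + 4 * cos_2sm ** 2)))
        return b * A * (sigma - d_sigma) / 1000.0

    # YUD -> YPR, decimal degrees (converted from the DMS values listed below)
    print(vincenty_km(56.53583, -76.51806, 54.28583, -130.44500))  # ~3336 km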

Haversine formula
  • 2066.039 miles
  • 3324.968 kilometers
  • 1795.339 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
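
A matching Python sketch of the haversine formula is below. The mean Earth radius of 6371 km is an assumed value; the result shifts slightly with the radius chosen and with the precision of the input coordinates.

    from math import asin, cos, radians, sin, sqrt

    def haversine_km(lat1, lon1, lat2, lon2, r=6371.0):
        # Great-circle distance on a sphere of radius r km (assumed mean radius).
        phi1, phi2 = radians(lat1), radians(lat2)
        dphi = radians(lat2 - lat1)
        dlam = radians(lon2 - lon1)
        h = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
        return 2 * r * asin(sqrt(h))

    print(haversine_km(56.53583, -76.51806, 54.28583, -130.44500))  # ~3325 km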

How long does it take to fly from Umiujaq to Prince Rupert?

The estimated flight time from Umiujaq Airport to Prince Rupert Airport is 4 hours and 25 minutes.
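
The calculator does not publish its timing model. As a purely illustrative sketch, an assumed average block speed of about 470 mph, back-derived from the 2073-mile distance and the 4 hours 25 minutes figure, reproduces the estimate:

    def flight_time_hm(distance_miles, avg_speed_mph=470.0):
        # avg_speed_mph is a hypothetical block speed back-derived from the
        # figures on this page; it is not the site's documented method.
        hours = distance_miles / avg_speed_mph
        h = int(hours)
        m = round((hours - h) * 60)
        return f"{h} hours {m} minutes"

    print(flight_time_hm(2073))  # 4 hours 25 minutes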

Flight carbon footprint between Umiujaq Airport (YUD) and Prince Rupert Airport (YPR)

On average, flying from Umiujaq to Prince Rupert generates about 226 kg of CO2 per passenger, which is roughly 498 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
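
The emission model is not documented either. The sketch below simply back-derives a per-passenger-kilometer factor from the figures above (226 kg over 3336 km, about 68 g of CO2 per passenger-km) and converts kilograms to pounds:

    KG_CO2_PER_PAX_KM = 226 / 3336   # ~0.068 kg/km, back-derived from the figures above
    LBS_PER_KG = 2.20462             # kilograms-to-pounds conversion factor

    distance_km = 3336
    co2_kg = distance_km * KG_CO2_PER_PAX_KM
    print(round(co2_kg), "kg CO2 =", round(co2_kg * LBS_PER_KG), "lbs")  # 226 kg CO2 = 498 lbs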

Map of flight path and driving directions from Umiujaq to Prince Rupert

See the map of the shortest flight path between Umiujaq Airport (YUD) and Prince Rupert Airport (YPR).

Airport information

Origin: Umiujaq Airport
City: Umiujaq
Country: Canada
IATA Code: YUD
ICAO Code: CYMU
Coordinates: 56°32′9″N, 76°31′5″W

Destination: Prince Rupert Airport
City: Prince Rupert
Country: Canada
IATA Code: YPR
ICAO Code: CYPR
Coordinates: 54°17′9″N, 130°26′42″W
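
To plug these coordinates into the distance formulas earlier on the page, convert them from degrees/minutes/seconds to decimal degrees, with south and west taken as negative:

    def dms_to_decimal(degrees, minutes, seconds, hemisphere):
        # Convert degrees/minutes/seconds plus a hemisphere letter to decimal degrees.
        value = degrees + minutes / 60 + seconds / 3600
        return -value if hemisphere in ("S", "W") else value

    yud = (dms_to_decimal(56, 32, 9, "N"), dms_to_decimal(76, 31, 5, "W"))
    ypr = (dms_to_decimal(54, 17, 9, "N"), dms_to_decimal(130, 26, 42, "W"))
    print(yud)  # (56.535833..., -76.518055...)
    print(ypr)  # (54.285833..., -130.445)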