
How far is Bathurst from Umiujaq?

The distance between Umiujaq (Umiujaq Airport) and Bathurst (Bathurst Airport (New Brunswick)) is 766 miles / 1233 kilometers / 666 nautical miles.

The driving distance from Umiujaq (YUD) to Bathurst (ZBF) is 1227 miles / 1974 kilometers, and travel time by car is about 31 hours 23 minutes.

Umiujaq Airport – Bathurst Airport (New Brunswick)

766 miles / 1233 kilometers / 666 nautical miles

Distance from Umiujaq to Bathurst

There are several ways to calculate the distance from Umiujaq to Bathurst. Here are two standard methods:

Vincenty's formula (applied above)
  • 766.219 miles
  • 1233.110 kilometers
  • 665.826 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
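As a rough sketch of the idea (not the exact code behind this page), the standard Vincenty inverse iteration on the WGS-84 ellipsoid fits in a few dozen lines of Python. Feeding it the airport coordinates listed at the bottom of the page, converted to decimal degrees, gives a figure close to the 766-mile value above.

import math

def vincenty_miles(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Ellipsoidal (WGS-84) distance in statute miles via Vincenty's inverse formula."""
    a = 6378137.0                      # semi-major axis, metres
    f = 1 / 298.257223563              # flattening
    b = (1 - f) * a                    # semi-minor axis, metres

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0                 # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m
                                     + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
        - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1609.344   # metres -> statute miles

# YUD (56°32′9″N, 76°31′5″W) -> ZBF (47°37′46″N, 65°44′20″W), in decimal degrees
print(round(vincenty_miles(56.535833, -76.518056, 47.629444, -65.738889), 1))  # ~766 miles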

Haversine formula
  • 765.028 miles
  • 1231.192 kilometers
  • 664.791 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
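For comparison, the haversine calculation is only a few lines. This is a minimal sketch assuming a mean Earth radius of 6,371 km (the exact radius used for the figures above is not stated on the page):

import math

def haversine_miles(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two points on a sphere, in statute miles."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    km = 2 * radius_km * math.asin(math.sqrt(a))
    return km / 1.609344                               # kilometres -> statute miles

print(round(haversine_miles(56.535833, -76.518056, 47.629444, -65.738889), 1))  # ~765 miles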

How long does it take to fly from Umiujaq to Bathurst?

The estimated flight time from Umiujaq Airport to Bathurst Airport (New Brunswick) is 1 hour and 57 minutes.
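The assumptions behind that estimate are not stated on the page. A common rule of thumb is a cruise speed of roughly 500 mph plus about half an hour for taxi, takeoff, climb, and descent; the sketch below uses those illustrative values and lands within a few minutes of the 1 hour 57 minute figure quoted above.

def rough_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    """Very rough block-time estimate: cruise leg plus a fixed overhead.

    cruise_mph and overhead_min are illustrative assumptions, not the
    calculator's actual parameters.
    """
    total_min = overhead_min + distance_miles / cruise_mph * 60
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} h {minutes:02d} min"

print(rough_flight_time(766.219))  # ~2 h 02 min, close to the quoted 1 h 57 min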

Flight carbon footprint between Umiujaq Airport (YUD) and Bathurst Airport (New Brunswick) (ZBF)

On average, flying from Umiujaq to Bathurst generates about 132 kg of CO2 per passenger, which is equivalent to roughly 290 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
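To illustrate how a per-passenger figure like this can be produced, the sketch below multiplies the flight distance by an assumed short-haul emission factor of about 0.107 kg of CO2 per passenger-kilometre (chosen here only so the result lands near 132 kg; the calculator's real methodology is not published on this page) and converts the result to pounds.

def co2_per_passenger(distance_km, kg_per_pax_km=0.107):
    """Per-passenger CO2 estimate: distance times an assumed emission factor.

    kg_per_pax_km is an illustrative short-haul factor, not the site's value.
    """
    kg = distance_km * kg_per_pax_km
    lbs = kg * 2.20462                 # kilograms -> pounds
    return kg, lbs

kg, lbs = co2_per_passenger(1233.110)
print(f"{kg:.0f} kg CO2 per passenger (~{lbs:.0f} lbs)")  # ~132 kg, ~291 lbs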

Map of flight path and driving directions from Umiujaq to Bathurst

See the map of the shortest flight path between Umiujaq Airport (YUD) and Bathurst Airport (New Brunswick) (ZBF).

Airport information

Origin Umiujaq Airport
City: Umiujaq
Country: Canada
IATA Code: YUD
ICAO Code: CYMU
Coordinates: 56°32′9″N, 76°31′5″W
Destination Bathurst Airport (New Brunswick)
City: Bathurst
Country: Canada
IATA Code: ZBF
ICAO Code: CZBF
Coordinates: 47°37′46″N, 65°44′20″W
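The distance formulas above take decimal degrees, while the coordinates listed here are in degrees, minutes, and seconds. A small helper for the conversion (assuming N/E are positive and S/W negative):

def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# Umiujaq Airport (YUD): 56°32′9″N, 76°31′5″W
print(dms_to_decimal(56, 32, 9, "N"), dms_to_decimal(76, 31, 5, "W"))
# Bathurst Airport (ZBF): 47°37′46″N, 65°44′20″W
print(dms_to_decimal(47, 37, 46, "N"), dms_to_decimal(65, 44, 20, "W"))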