
How far is Bathurst from Ottawa?

The distance between Ottawa (Ottawa Macdonald–Cartier International Airport) and Bathurst (Bathurst Airport) is 9916 miles / 15958 kilometers / 8616 nautical miles.

Ottawa Macdonald–Cartier International Airport – Bathurst Airport

  • Distance: 9916 miles / 15958 kilometers / 8616 nautical miles
  • Flight time: 19 h 16 min
  • CO2 emission: 1 288 kg


Distance from Ottawa to Bathurst

There are several ways to calculate the distance from Ottawa to Bathurst. Here are two standard methods:

Vincenty's formula (applied above)
  • 9915.575 miles
  • 15957.571 kilometers
  • 8616.399 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
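The iterative Vincenty inverse solution on the WGS-84 ellipsoid can be sketched as follows. This is a minimal illustration, not the site's published code; the coordinates are the airport positions listed below, converted to decimal degrees:

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Vincenty inverse formula on the WGS-84 ellipsoid (distance in km)."""
    a = 6378137.0                 # semi-major axis (m)
    f = 1 / 298.257223563         # flattening
    b = (1 - f) * a               # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitude
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0            # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # cos(2 * sigma_m); zero when both points lie on the equator
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                   if cos2_alpha else 0.0)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
                           * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0

# YOW (45°19′20″N, 75°40′9″W) to BHS (33°24′33″S, 149°39′7″E)
print(vincenty_km(45.3222, -75.6692, -33.4092, 149.6519))
```

The result lands within a kilometer or so of the 15957.571 km figure above; tiny differences come from rounding the coordinates to four decimal places.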

Haversine formula
  • 9917.328 miles
  • 15960.392 kilometers
  • 8617.922 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
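A minimal haversine implementation looks like this (using a mean Earth radius of 6371 km; small differences from the figures above come from the radius chosen):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere via the haversine formula."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return radius_km * 2 * math.atan2(math.sqrt(a), math.sqrt(1 - a))

# YOW to BHS, then convert kilometers to statute and nautical miles
km = haversine_km(45.3222, -75.6692, -33.4092, 149.6519)
print(km, km / 1.609344, km / 1.852)
```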

How long does it take to fly from Ottawa to Bathurst?

The estimated flight time from Ottawa Macdonald–Cartier International Airport to Bathurst Airport is 19 hours and 16 minutes.
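The published duration implies an average block speed of roughly 515 mph. The site's exact model isn't stated, but a simple distance-over-speed estimate can be sketched like this (the 500 mph average is an assumed round number, not the site's parameter):

```python
def flight_time(distance_miles, avg_speed_mph=500.0):
    """Estimate block time as distance / average speed, as (hours, minutes)."""
    hours = distance_miles / avg_speed_mph
    h = int(hours)
    m = int(round((hours - h) * 60))
    if m == 60:                    # guard against rounding up to a full hour
        h, m = h + 1, 0
    return h, m

h, m = flight_time(9916)           # assumed 500 mph average speed
print(f"{h} h {m} min")            # a rough estimate, not the exact figure above
```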

Flight carbon footprint between Ottawa Macdonald–Cartier International Airport (YOW) and Bathurst Airport (BHS)

On average, flying from Ottawa to Bathurst generates about 1 288 kg (2 840 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
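The pounds figure follows from the standard conversion factor (1 kg ≈ 2.20462 lb):

```python
co2_kg = 1288
co2_lb = co2_kg * 2.20462262       # kilograms to pounds
print(round(co2_lb))               # → 2840
```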

Map of flight path from Ottawa to Bathurst

See the map of the shortest flight path between Ottawa Macdonald–Cartier International Airport (YOW) and Bathurst Airport (BHS).

Airport information

Origin: Ottawa Macdonald–Cartier International Airport
City: Ottawa
Country: Canada
IATA Code: YOW
ICAO Code: CYOW
Coordinates: 45°19′20″N, 75°40′9″W
Destination: Bathurst Airport
City: Bathurst
Country: Australia
IATA Code: BHS
ICAO Code: YBTH
Coordinates: 33°24′33″S, 149°39′7″E