
How far is Bathurst from Chatham Island?

The distance between Chatham Island (Chatham Islands / Tuuta Airport) and Bathurst (Bathurst Airport) is 1953 miles / 3143 kilometers / 1697 nautical miles.

Chatham Islands / Tuuta Airport – Bathurst Airport

Distance: 1953 miles / 3143 kilometers / 1697 nautical miles
Flight time: 4 h 11 min
Time difference: 2 h 45 min
CO2 emission: 213 kg


Distance from Chatham Island to Bathurst

There are several ways to calculate the distance from Chatham Island to Bathurst. Here are two standard methods:

Vincenty's formula (applied above)
  • 1952.745 miles
  • 3142.638 kilometers
  • 1696.889 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
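The iterative inverse method described above can be sketched as follows. This is a minimal implementation on the WGS-84 ellipsoid, not necessarily the calculator's exact code; the coordinates are the airport positions listed below, converted to decimal degrees (south and west negative).

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Vincenty inverse distance on the WGS-84 ellipsoid, in miles."""
    a = 6378137.0                  # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563          # WGS-84 flattening
    b = (1 - f) * a                # semi-minor axis (m)

    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    # Normalize the longitude difference to [-180, 180) degrees so the
    # iteration follows the short arc (this route crosses 180 degrees).
    L = math.radians(((lon2 - lon1 + 540.0) % 360.0) - 180.0)

    U1 = math.atan((1 - f) * math.tan(phi1))   # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0             # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
                          * (-3 + 4 * cos_2sm ** 2)))
    meters = b * A * (sigma - d_sigma)
    return meters / 1609.344       # meters to international miles

# CHT (43°48′36″S, 176°27′25″W) to BHS (33°24′33″S, 149°39′7″E)
d = vincenty_miles(-43.81, -176.456944, -33.409167, 149.651944)
```

With these inputs the result lands close to the 1952.745 miles quoted above.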

Haversine formula
  • 1949.110 miles
  • 3136.789 kilometers
  • 1693.731 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
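The haversine calculation is compact enough to show in full. This sketch assumes a mean Earth radius of 6371 km (the calculator's constant may differ slightly) and uses the airport coordinates in decimal degrees:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given radius, in km."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# CHT to BHS; sin^2 makes the formula safe even though this route
# crosses the antimeridian (the raw longitude difference exceeds 180).
km = haversine_km(-43.81, -176.456944, -33.409167, 149.651944)
```

This reproduces the ~3136.8 km figure above; dividing by 1.609344 gives the ~1949.1 miles.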

How long does it take to fly from Chatham Island to Bathurst?

The estimated flight time from Chatham Islands / Tuuta Airport to Bathurst Airport is 4 hours and 11 minutes.
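Calculators like this typically derive block time from a fixed taxi/climb allowance plus cruise at an assumed average speed. The 30-minute buffer and 530 mph below are assumptions that happen to reproduce the 4 h 11 min shown; the site's actual parameters are not published here.

```python
distance_miles = 1952.745   # Vincenty distance from above
cruise_mph = 530            # assumed average cruise speed
buffer_min = 30             # assumed taxi/climb/descent allowance

total_min = round(buffer_min + 60 * distance_miles / cruise_mph)
hours, minutes = divmod(total_min, 60)
print(f"{hours} h {minutes} min")   # -> 4 h 11 min
```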

Flight carbon footprint between Chatham Islands / Tuuta Airport (CHT) and Bathurst Airport (BHS)

On average, flying from Chatham Island to Bathurst generates about 213 kg of CO2 per passenger, which is equivalent to roughly 470 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
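The unit conversion, and the per-mile rate it implies, can be sanity-checked directly (1 kg = 2.20462 lb):

```python
co2_kg = 213
lbs = co2_kg * 2.20462           # kilograms to pounds
per_mile = co2_kg / 1952.745     # implied kg CO2 per passenger-mile
print(round(lbs))                # -> 470
```

The implied rate works out to roughly 0.11 kg of CO2 per passenger-mile.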

Map of flight path from Chatham Island to Bathurst

See the map of the shortest flight path between Chatham Islands / Tuuta Airport (CHT) and Bathurst Airport (BHS).

Airport information

Origin Chatham Islands / Tuuta Airport
City: Chatham Island
Country: New Zealand
IATA Code: CHT
ICAO Code: NZCI
Coordinates: 43°48′36″S, 176°27′25″W
Destination Bathurst Airport
City: Bathurst
Country: Australia
IATA Code: BHS
ICAO Code: YBTH
Coordinates: 33°24′33″S, 149°39′7″E
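The coordinates above are given in degrees, minutes, and seconds; the distance formulas take decimal degrees. The conversion is straightforward (south and west hemispheres are negative):

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere to decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

cht_lat = dms_to_decimal(43, 48, 36, "S")    # Chatham Islands / Tuuta
bhs_lon = dms_to_decimal(149, 39, 7, "E")    # Bathurst Airport
```

For example, 43°48′36″S becomes -43.81 and 149°39′7″E becomes about 149.652.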