
How far is Bathurst from São Paulo?

The distance between São Paulo (São Paulo–Guarulhos International Airport) and Bathurst (Bathurst Airport) is 8380 miles / 13487 kilometers / 7282 nautical miles.

São Paulo–Guarulhos International Airport – Bathurst Airport

Distance: 8380 miles / 13487 kilometers / 7282 nautical miles
Flight time: 16 h 22 min
CO2 emission: 1 054 kg


Distance from São Paulo to Bathurst

There are several ways to calculate the distance from São Paulo to Bathurst. Here are two standard methods:

Vincenty's formula (applied above)
  • 8380.248 miles
  • 13486.702 kilometers
  • 7282.236 nautical miles

Vincenty's formula iteratively calculates the distance between two latitude/longitude points on the Earth's surface using an ellipsoidal model of the planet, which makes it more accurate than methods that assume a sphere.
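As a sketch, here is a minimal Python implementation of the standard Vincenty inverse algorithm on the WGS-84 ellipsoid, applied to the GRU and BHS coordinates listed under "Airport information" below. The constants, convergence tolerance, and iteration cap are conventional choices, not the site's published code:

```python
import math

def vincenty_inverse(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Vincenty inverse solution on the WGS-84 ellipsoid; returns meters."""
    a = 6378137.0                  # semi-major axis (m)
    f = 1 / 298.257223563          # flattening
    b = (1 - f) * a                # semi-minor axis (m)

    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)
    L = (L + math.pi) % (2 * math.pi) - math.pi   # wrap to [-pi, pi]

    U1 = math.atan((1 - f) * math.tan(phi1))      # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0                            # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        if cos2_alpha == 0:                       # both points on the equator
            cos_2sigma_m = 0.0
        else:
            cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
                    * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# GRU (23°26′8″S, 46°28′23″W) and BHS (33°24′33″S, 149°39′7″E)
gru = (-(23 + 26/60 + 8/3600), -(46 + 28/60 + 23/3600))
bhs = (-(33 + 24/60 + 33/3600), 149 + 39/60 + 7/3600)
km = vincenty_inverse(gru[0], gru[1], bhs[0], bhs[1]) / 1000
print(f"{km:.3f} km")   # ≈ 13486.7 km, matching the figure above
```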

Haversine formula
  • 8368.328 miles
  • 13467.518 kilometers
  • 7271.878 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical Earth. This gives the great-circle distance, i.e. the shortest path between two points along the surface of the sphere.
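The haversine formula fits in a few lines of Python. This sketch uses the GRU and BHS coordinates listed under "Airport information" below and an assumed mean Earth radius of 6371 km (the site does not state which radius it uses):

```python
import math

EARTH_RADIUS_KM = 6371.0  # assumed mean Earth radius

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in kilometers."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return EARTH_RADIUS_KM * 2 * math.atan2(math.sqrt(a), math.sqrt(1 - a))

# GRU (23°26′8″S, 46°28′23″W) and BHS (33°24′33″S, 149°39′7″E)
km = haversine_km(-23.4356, -46.4731, -33.4092, 149.6519)
print(f"{km:.1f} km")  # ≈ 13467.5 km, matching the figure above
```

Note that the haversine result (13467.5 km) differs from the Vincenty result (13486.7 km) by about 19 km, which is the cost of treating the Earth as a sphere rather than an ellipsoid.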

How long does it take to fly from São Paulo to Bathurst?

The estimated flight time from São Paulo–Guarulhos International Airport to Bathurst Airport is 16 hours and 22 minutes.
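The site does not publish its flight-time model. A common back-of-envelope estimate is distance divided by an average block speed; an assumed average of roughly 512 mph happens to reproduce the figure shown, but that speed is a reverse-engineered guess, not a documented parameter:

```python
def fmt_duration(hours):
    """Format fractional hours as 'H h M min'."""
    h = int(hours)
    m = round((hours - h) * 60)
    if m == 60:          # guard against rounding up to a full hour
        h, m = h + 1, 0
    return f"{h} h {m} min"

AVG_SPEED_MPH = 512   # assumed average block speed (not published by the site)
print(fmt_duration(8380.248 / AVG_SPEED_MPH))  # 16 h 22 min
```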

Flight carbon footprint between São Paulo–Guarulhos International Airport (GRU) and Bathurst Airport (BHS)

On average, flying from São Paulo to Bathurst generates about 1 054 kg of CO2 per passenger, equivalent to roughly 2 325 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
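The kilogram-to-pound conversion is simple arithmetic. Converting the rounded 1 054 kg gives about 2 324 lb, so the 2 325 lb shown presumably comes from an unrounded emissions figure:

```python
KG_PER_LB = 0.45359237  # exact definition of the avoirdupois pound

def kg_to_lb(kg):
    """Convert kilograms to pounds."""
    return kg / KG_PER_LB

print(round(kg_to_lb(1054)))  # 2324
```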

Map of flight path from São Paulo to Bathurst

See the map of the shortest flight path between São Paulo–Guarulhos International Airport (GRU) and Bathurst Airport (BHS).

Airport information

Origin São Paulo–Guarulhos International Airport
City: São Paulo
Country: Brazil
IATA Code: GRU
ICAO Code: SBGR
Coordinates: 23°26′8″S, 46°28′23″W
Destination Bathurst Airport
City: Bathurst
Country: Australia
IATA Code: BHS
ICAO Code: YBTH
Coordinates: 33°24′33″S, 149°39′7″E