
How far is Newcastle from Saibai Island?

The distance between Saibai Island (Saibai Island Airport) and Newcastle (Newcastle Airport) is 1715 miles / 2760 kilometers / 1491 nautical miles.

The driving distance from Saibai Island (SBR) to Newcastle (NTL) is 2054 miles / 3305 kilometers, and travel time by car is about 49 hours 21 minutes.
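The three figures above are the same distance expressed in different units. A minimal Python sketch of the conversions, using the standard factors (the rounding here is mine and may differ slightly from the page's):

```python
MILES_PER_KM = 0.621371   # statute miles per kilometre
KM_PER_NM = 1.852         # kilometres per nautical mile

def km_to_miles(km: float) -> float:
    return km * MILES_PER_KM

def km_to_nautical_miles(km: float) -> float:
    return km / KM_PER_NM

flight_km = 2760.458  # Vincenty distance quoted below
print(round(km_to_miles(flight_km)))           # ~1715 miles
print(round(km_to_nautical_miles(flight_km)))  # ~1491 nautical miles
```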

Saibai Island Airport – Newcastle Airport

1715 miles / 2760 kilometers / 1491 nautical miles


Distance from Saibai Island to Newcastle

There are several ways to calculate the distance from Saibai Island to Newcastle. Here are two standard methods:

Vincenty's formula (applied above)
  • 1715.269 miles
  • 2760.458 kilometers
  • 1490.528 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
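As a rough illustration of how an ellipsoidal calculation of this kind works, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The airport coordinates are taken from the airport information below; this is a sketch of the general method, not necessarily the exact implementation behind the figures above.

```python
import math

def vincenty_inverse(lat1, lon1, lat2, lon2,
                     a=6378137.0, f=1 / 298.257223563,
                     tol=1e-12, max_iter=200):
    """Distance in metres between two points on the WGS-84 ellipsoid."""
    b = a * (1 - f)
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (0.0 if cos2_alpha == 0
                        else cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# Saibai Island Airport (SBR) and Newcastle Airport (NTL), decimal degrees
saibai = (-9.378056, 142.625000)
newcastle = (-32.794722, 151.833889)
print(f"{vincenty_inverse(*saibai, *newcastle) / 1000:.3f} km")  # roughly 2,760 km
```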

Haversine formula
  • 1721.316 miles
  • 2770.190 kilometers
  • 1495.783 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
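A corresponding Python sketch of the haversine formula, assuming a mean Earth radius of 6,371 km (the radius the page uses is not stated, so the last decimals may differ):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points on a sphere."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    d_phi = math.radians(lat2 - lat1)
    d_lambda = math.radians(lon2 - lon1)
    a = (math.sin(d_phi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(d_lambda / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# SBR and NTL coordinates from the airport information below
print(f"{haversine_km(-9.378056, 142.625, -32.794722, 151.833889):.1f} km")  # ~2,770 km
```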

How long does it take to fly from Saibai Island to Newcastle?

The estimated flight time from Saibai Island Airport to Newcastle Airport is 3 hours and 44 minutes.
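The page does not state how this figure is derived. A common rule of thumb, used here purely as an illustrative assumption, is cruise time at a typical jet speed plus a fixed allowance for take-off, climb and descent; with the assumed values below it lands close to, but not exactly on, the 3 hours 44 minutes quoted above.

```python
def estimate_flight_time_hours(distance_km: float,
                               cruise_kmh: float = 840.0,    # assumed cruise speed
                               overhead_hours: float = 0.5):  # assumed taxi/climb/descent
    return distance_km / cruise_kmh + overhead_hours

hours = estimate_flight_time_hours(2760.458)
print(f"{int(hours)} h {round(hours % 1 * 60)} min")  # ~3 h 47 min with these assumptions
```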

What is the time difference between Saibai Island and Newcastle?

There is no time difference between Saibai Island and Newcastle.

Flight carbon footprint between Saibai Island Airport (SBR) and Newcastle Airport (NTL)

On average, flying from Saibai Island to Newcastle generates about 194 kg of CO2 per passenger, which is roughly 427 pounds (lbs). The figure is an estimate and covers only the CO2 generated by burning jet fuel.
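For reference, the kilogram-to-pound conversion and the implied per-kilometre rate can be reproduced from the figures on this page (the per-kilometre value below is simply derived from these numbers, not an official emission factor):

```python
KG_TO_LBS = 2.20462

co2_kg = 194.0          # per-passenger estimate quoted above
distance_km = 2760.458  # Vincenty distance quoted above

print(f"{co2_kg * KG_TO_LBS:.1f} lbs")             # ~427.7 lbs, rounded to 427 above
print(f"{co2_kg / distance_km * 1000:.0f} g/km")    # ~70 g of CO2 per passenger-kilometre
```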

Map of flight path and driving directions from Saibai Island to Newcastle

See the map of the shortest flight path between Saibai Island Airport (SBR) and Newcastle Airport (NTL).

Airport information

Origin: Saibai Island Airport
City: Saibai Island
Country: Australia
IATA Code: SBR
ICAO Code: YSII
Coordinates: 9°22′41″S, 142°37′30″E
Destination: Newcastle Airport
City: Newcastle
Country: Australia
IATA Code: NTL
ICAO Code: YWLM
Coordinates: 32°47′41″S, 151°50′2″E
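The coordinates above are given in degrees, minutes and seconds; converting them to the decimal degrees used by the distance formulas is a small arithmetic step. A minimal sketch:

```python
def dms_to_decimal(degrees: int, minutes: int, seconds: float, hemisphere: str) -> float:
    """Convert degrees/minutes/seconds plus hemisphere letter to signed decimal degrees."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

# Saibai Island Airport (SBR): 9°22′41″S, 142°37′30″E
print(dms_to_decimal(9, 22, 41, "S"), dms_to_decimal(142, 37, 30, "E"))   # -9.378..., 142.625
# Newcastle Airport (NTL): 32°47′41″S, 151°50′2″E
print(dms_to_decimal(32, 47, 41, "S"), dms_to_decimal(151, 50, 2, "E"))   # -32.794..., 151.833...
```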