
How far is Saibai Island from Hagåtña?

The distance between Hagåtña (Guam Antonio B. Won Pat International Airport) and Saibai Island (Saibai Island Airport) is 1578 miles / 2540 kilometers / 1371 nautical miles.

Guam Antonio B. Won Pat International Airport – Saibai Island Airport

1578 miles / 2540 kilometers / 1371 nautical miles


Distance from Hagåtña to Saibai Island

There are several ways to calculate the distance from Hagåtña to Saibai Island. Here are two standard methods:

Vincenty's formula (applied above)
  • 1578.061 miles
  • 2539.643 kilometers
  • 1371.297 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.

Haversine formula
  • 1586.598 miles
  • 2553.382 kilometers
  • 1378.716 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
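As an illustration, here is a minimal Python sketch of the haversine calculation, using the airport coordinates listed in the airport information below and an assumed mean Earth radius of 6,371 km for the spherical model:

    import math

    def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
        """Great-circle distance between two latitude/longitude points on a sphere."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlmb = math.radians(lon2 - lon1)
        a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
        return 2 * radius_km * math.asin(math.sqrt(a))

    # GUM: 13°29′0″N, 144°47′45″E  -> (13.4833, 144.7958)
    # SBR:  9°22′41″S, 142°37′30″E -> (-9.3781, 142.6250)
    print(f"{haversine_km(13.4833, 144.7958, -9.3781, 142.6250):.0f} km")  # ~2553 km, matching the haversine figure above

The Vincenty figure above accounts for the Earth's ellipsoidal shape; reproducing it requires an iterative geodesic solver, so in practice a geodesy library is normally used rather than hand-rolling the iteration.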

How long does it take to fly from Hagåtña to Saibai Island?

The estimated flight time from Guam Antonio B. Won Pat International Airport to Saibai Island Airport is 3 hours and 29 minutes.
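The page does not state the assumptions behind this estimate. A common rough approach, sketched below in Python, is distance divided by an average cruise speed plus a fixed allowance for taxi, climb, and descent; the 500 mph and 30 minute values here are illustrative assumptions and will not exactly reproduce the 3 hours 29 minutes quoted above.

    def rough_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
        # Block-time estimate: cruise time plus a fixed taxi/climb/descent allowance.
        # cruise_mph and overhead_min are illustrative assumptions, not the site's model.
        total_min = distance_miles / cruise_mph * 60 + overhead_min
        return divmod(round(total_min), 60)

    print(rough_flight_time(1578))  # (3, 39) under these assumptions; the page quotes 3 h 29 min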

What is the time difference between Hagåtña and Saibai Island?

There is no time difference between Hagåtña and Saibai Island.
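This can be checked with Python's zoneinfo, assuming Guam uses the Pacific/Guam zone (ChST, UTC+10) and that Saibai Island, as part of Queensland, follows Australia/Brisbane (AEST, UTC+10, no daylight saving):

    from datetime import datetime, timezone
    from zoneinfo import ZoneInfo

    now = datetime.now(timezone.utc)
    # Assumed IANA zones for the two airports (see note above).
    guam_offset = now.astimezone(ZoneInfo("Pacific/Guam")).utcoffset()
    saibai_offset = now.astimezone(ZoneInfo("Australia/Brisbane")).utcoffset()
    print(guam_offset, saibai_offset)  # both 10:00:00, so there is no time difference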

Flight carbon footprint between Guam Antonio B. Won Pat International Airport (GUM) and Saibai Island Airport (SBR)

On average, flying from Hagåtña to Saibai Island generates about 184 kg of CO2 per passenger, which is roughly 407 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
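For context, these figures imply an emission factor of roughly 184 kg ÷ 2,540 km ≈ 72 g of CO2 per passenger-kilometre. The Python sketch below simply redoes that arithmetic from the rounded numbers quoted above; it is not the site's actual emissions model.

    CO2_KG = 184        # per-passenger estimate quoted above
    DISTANCE_KM = 2540  # great-circle distance quoted above
    KG_PER_LB = 0.45359237

    print(f"~{CO2_KG * 1000 / DISTANCE_KM:.0f} g CO2 per passenger-km")  # ~72 g
    print(f"~{CO2_KG / KG_PER_LB:.0f} lb")  # ~406 lb; the page's 407 lb likely reflects an unrounded kg value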

Map of flight path from Hagåtña to Saibai Island

See the map of the shortest flight path between Guam Antonio B. Won Pat International Airport (GUM) and Saibai Island Airport (SBR).

Airport information

Origin: Guam Antonio B. Won Pat International Airport
City: Hagåtña
Country: Guam
IATA Code: GUM
ICAO Code: PGUM
Coordinates: 13°29′0″N, 144°47′45″E
Destination: Saibai Island Airport
City: Saibai Island
Country: Australia
IATA Code: SBR
ICAO Code: YSII
Coordinates: 9°22′41″S, 142°37′30″E