
How far is Bamaga from Hagåtña?

The distance between Hagåtña (Guam Antonio B. Won Pat International Airport) and Bamaga (Northern Peninsula Airport) is 1687 miles / 2715 kilometers / 1466 nautical miles.

Guam Antonio B. Won Pat International Airport – Northern Peninsula Airport

1687 miles / 2715 kilometers / 1466 nautical miles


Distance from Hagåtña to Bamaga

There are several ways to calculate the distance from Hagåtña to Bamaga. Here are two standard methods:

Vincenty's formula (applied above)
  • 1686.721 miles
  • 2714.514 kilometers
  • 1465.720 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
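As a concrete illustration, here is a minimal Python sketch of the Vincenty inverse method on the WGS-84 ellipsoid. The calculator's exact implementation is not shown, so the convergence tolerance and iteration cap below are assumptions; the iteration is also known to converge poorly for nearly antipodal points.

```python
import math

def vincenty_inverse_km(lat1, lon1, lat2, lon2,
                        a=6378.137, f=1 / 298.257223563,
                        tol=1e-12, max_iter=200):
    """Vincenty inverse formula on the WGS-84 ellipsoid.

    Coordinates in decimal degrees; returns the geodesic distance in km.
    """
    b = a * (1 - f)                                          # semi-minor axis
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))   # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0                                       # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)              # equatorial case
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
              * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# GUM -> ABM with the coordinates listed in the airport information below
print(f"{vincenty_inverse_km(13.4833, 144.7958, -10.9506, 142.4589):.1f} km")
# -> approximately 2714.5 km, matching the figure quoted above
```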

Haversine formula
  • 1695.826 miles
  • 2729.168 kilometers
  • 1473.633 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
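For comparison, a minimal Python sketch of the haversine computation, assuming a mean Earth radius of 6371 km (a common convention; other radius choices shift the result slightly):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two points on a spherical Earth.

    Coordinates in decimal degrees; returns distance in kilometers.
    """
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# GUM -> ABM with the coordinates listed in the airport information below
km = haversine_km(13.4833, 144.7958, -10.9506, 142.4589)
print(f"{km:.0f} km / {km * 0.621371:.0f} mi / {km / 1.852:.0f} NM")
# -> approximately 2729 km / 1696 mi / 1474 NM, as quoted above
```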

How long does it take to fly from Hagåtña to Bamaga?

The estimated flight time from Guam Antonio B. Won Pat International Airport to Northern Peninsula Airport is 3 hours and 41 minutes.
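Flight-time estimates of this kind are typically computed as a fixed taxi/climb/descent allowance plus cruise time at an assumed average speed. The calculator's own constants are not published, so the values below are illustrative assumptions and will not reproduce the quoted 3 hours 41 minutes exactly:

```python
CRUISE_MPH = 500       # assumed average cruise speed (hypothetical)
OVERHEAD_MIN = 30      # assumed taxi/climb/descent allowance (hypothetical)

def estimate_flight_minutes(distance_miles: float) -> int:
    """Rule-of-thumb flight time: fixed overhead plus cruise at constant speed."""
    return round(distance_miles / CRUISE_MPH * 60 + OVERHEAD_MIN)

minutes = estimate_flight_minutes(1687)
print(f"{minutes // 60} h {minutes % 60} min")  # -> 3 h 52 min with these assumptions
```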

What is the time difference between Hagåtña and Bamaga?

There is no time difference between Hagåtña and Bamaga: Guam observes Chamorro Standard Time (UTC+10), and Bamaga, in Queensland, observes Australian Eastern Standard Time (also UTC+10, with no daylight saving).

Flight carbon footprint between Guam Antonio B. Won Pat International Airport (GUM) and Northern Peninsula Airport (ABM)

On average, flying from Hagåtña to Bamaga generates about 192 kg of CO2 per passenger, equivalent to 423 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
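The per-passenger figure implies a roughly constant emission factor per mile flown. Back-solving that factor from the numbers quoted above (the calculator's actual methodology is not published) gives a sketch like this:

```python
KG_PER_PASSENGER_MILE = 192 / 1687   # implied factor, about 0.114 kg CO2 per mile
KG_TO_LBS = 2.20462                  # kilograms-to-pounds conversion

def co2_per_passenger(distance_miles: float) -> tuple[float, float]:
    """Estimated jet-fuel CO2 per passenger, returned as (kg, lbs)."""
    kg = distance_miles * KG_PER_PASSENGER_MILE
    return kg, kg * KG_TO_LBS

kg, lbs = co2_per_passenger(1687)
print(f"{kg:.0f} kg CO2 per passenger (~{lbs:.0f} lbs)")  # -> 192 kg (~423 lbs)
```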

Map of flight path from Hagåtña to Bamaga

See the map of the shortest flight path between Guam Antonio B. Won Pat International Airport (GUM) and Northern Peninsula Airport (ABM).

Airport information

Origin Guam Antonio B. Won Pat International Airport
City: Hagåtña
Country: Guam
IATA Code: GUM
ICAO Code: PGUM
Coordinates: 13°29′0″N, 144°47′45″E
Destination Northern Peninsula Airport
City: Bamaga
Country: Australia
IATA Code: ABM
ICAO Code: YBAM
Coordinates: 10°57′2″S, 142°27′32″E
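The coordinates above are given in degrees/minutes/seconds, while distance formulas like the two shown earlier take decimal degrees. A small conversion helper (the function name here is illustrative, not the site's):

```python
def dms_to_decimal(deg: int, minutes: int, seconds: float, hemisphere: str) -> float:
    """Convert degrees/minutes/seconds plus a hemisphere letter (N/S/E/W)
    to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

# The airport coordinates listed above, in decimal degrees
gum = (dms_to_decimal(13, 29, 0, "N"), dms_to_decimal(144, 47, 45, "E"))
abm = (dms_to_decimal(10, 57, 2, "S"), dms_to_decimal(142, 27, 32, "E"))
print(gum)  # (13.4833..., 144.7958...)
print(abm)  # (-10.9505..., 142.4588...)
```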