How far is Bamaga from Yam Island?

The distance between Yam Island (Yam Island Airport) and Bamaga (Northern Peninsula Airport) is 75 miles / 121 kilometers / 65 nautical miles.

The driving distance from Yam Island (XMY) to Bamaga (ABM) is 25 miles / 41 kilometers, and travel time by car is about 59 minutes.

Distance from Yam Island to Bamaga

There are several ways to calculate the distance from Yam Island to Bamaga. Here are two standard methods:

Vincenty's formula (applied above)
  • 75.300 miles
  • 121.184 kilometers
  • 65.434 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
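
The calculator doesn't publish its code, but the standard Vincenty inverse solution on the WGS-84 ellipsoid reproduces the figure above. The Python sketch below is a textbook version of that iteration, not the site's actual implementation:

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    # WGS-84 ellipsoid
    a = 6378137.0            # semi-major axis (metres)
    f = 1 / 298.257223563    # flattening
    b = (1 - f) * a          # semi-minor axis (metres)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)  # equatorial line
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
        - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
        * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1609.344  # metres -> statute miles

# XMY (9°54′3″S, 142°46′33″E) and ABM (10°57′2″S, 142°27′32″E) as decimal degrees
print(round(vincenty_miles(-9.900833, 142.775833, -10.950556, 142.458889), 3))
# prints roughly 75.3, in line with the figure quoted above
```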

Haversine formula
  • 75.658 miles
  • 121.760 kilometers
  • 65.745 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest path between two points along the sphere's surface).
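
For comparison, the haversine formula fits in a few lines. This is a minimal sketch, not the site's own code; the 3,958.8-mile value is the mean Earth radius, and the coordinates come from the airport information below:

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2):
    R = 3958.8  # mean Earth radius in statute miles
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

print(round(haversine_miles(-9.900833, 142.775833, -10.950556, 142.458889), 3))
# prints roughly 75.66, in line with the haversine figure above
```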

How long does it take to fly from Yam Island to Bamaga?

The estimated flight time from Yam Island Airport to Northern Peninsula Airport is 38 minutes.
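
The page doesn't say how this estimate is derived. A common rule of thumb for short flights is a fixed allowance for taxi, climb, and descent plus time at an assumed cruise speed; the 500 mph and 30-minute values below are illustrative assumptions, not the calculator's actual parameters:

```python
def estimated_flight_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    # overhead_min covers taxi, climb, and descent; both defaults are assumptions
    return overhead_min + distance_miles / cruise_mph * 60

print(round(estimated_flight_minutes(75.3)))  # 39, close to the 38 minutes quoted
```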

What is the time difference between Yam Island and Bamaga?

There is no time difference between Yam Island and Bamaga.

Flight carbon footprint between Yam Island Airport (XMY) and Northern Peninsula Airport (ABM)

On average, flying from Yam Island to Bamaga generates about 36 kg of CO2 per passenger, equivalent to roughly 80 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
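
The emissions model behind this number isn't published either. Purely as an illustration, the page's own figures imply an emission rate of roughly 0.48 kg of CO2 per passenger-mile (36 kg over 75.3 miles), and the pound figure is a straight unit conversion:

```python
KG_PER_LB = 0.45359237            # exact definition of the pound

co2_kg = 36.0                     # per-passenger estimate quoted above
distance_miles = 75.3             # Vincenty distance from this page

implied_factor = co2_kg / distance_miles   # about 0.48 kg CO2 per passenger-mile
co2_lbs = co2_kg / KG_PER_LB               # about 79.4 lbs, which rounds to ~80

print(f"{implied_factor:.2f} kg/mile, {co2_lbs:.1f} lbs")
```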

Map of flight path and driving directions from Yam Island to Bamaga

See the map of the shortest flight path between Yam Island Airport (XMY) and Northern Peninsula Airport (ABM).

Airport information

Origin: Yam Island Airport
City: Yam Island
Country: Australia
IATA Code: XMY
ICAO Code: YYMI
Coordinates: 9°54′3″S, 142°46′33″E
Destination: Northern Peninsula Airport
City: Bamaga
Country: Australia
IATA Code: ABM
ICAO Code: YBAM
Coordinates: 10°57′2″S, 142°27′32″E
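
The coordinates above are given in degrees, minutes, and seconds, while the distance formulas take decimal degrees. A small converter (south and west are negative by convention):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    # South and West hemispheres get a negative sign in decimal degrees
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# Yam Island Airport (XMY): 9°54′3″S, 142°46′33″E
print(dms_to_decimal(9, 54, 3, "S"))     # about -9.900833
print(dms_to_decimal(142, 46, 33, "E"))  # about 142.775833
```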