
How far is Bamaga from Maningrida?

The distance between Maningrida (Maningrida Airport) and Bamaga (Northern Peninsula Airport) is 563 miles / 906 kilometers / 489 nautical miles.

The driving distance from Maningrida (MNG) to Bamaga (ABM) is 2098 miles / 3377 kilometers, and travel time by car is about 54 hours 23 minutes.

Maningrida Airport – Northern Peninsula Airport


Distance from Maningrida to Bamaga

There are several ways to calculate the distance from Maningrida to Bamaga. Here are two standard methods:

Vincenty's formula (applied above)
  • 562.699 miles
  • 905.577 kilometers
  • 488.972 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
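As a sketch, Vincenty's inverse formula can be implemented in a few dozen lines of Python on the WGS-84 ellipsoid. The ellipsoid parameters, convergence tolerance, and iteration cap below are common choices, not the site's published implementation, so treat this as an illustration:

```python
from math import radians, sin, cos, tan, atan, atan2, sqrt

def vincenty_km(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula on the WGS-84 ellipsoid (distance only)."""
    a, f = 6378137.0, 1 / 298.257223563      # WGS-84 semi-major axis (m), flattening
    b = a * (1 - f)
    U1 = atan((1 - f) * tan(radians(lat1)))  # reduced latitudes
    U2 = atan((1 - f) * tan(radians(lat2)))
    L = radians(lon2 - lon1)
    sinU1, cosU1, sinU2, cosU2 = sin(U1), cos(U1), sin(U2), cos(U2)
    lam = L
    for _ in range(200):                     # iterate lambda until it converges
        sinlam, coslam = sin(lam), cos(lam)
        sin_sigma = sqrt((cosU2 * sinlam) ** 2 +
                         (cosU1 * sinU2 - sinU1 * cosU2 * coslam) ** 2)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * coslam
        sigma = atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sinlam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m +
                C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break
    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
        B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) *
        (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0  # meters -> kilometers

# MNG: 12°3′21″S, 134°14′2″E    ABM: 10°57′2″S, 142°27′32″E
print(round(vincenty_km(-12.055833, 134.233889, -10.950556, 142.458889), 3))
```

With the airport coordinates listed below, this reproduces the quoted figure of roughly 905.6 km.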

Haversine formula
  • 562.062 miles
  • 904.551 kilometers
  • 488.418 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth, giving the great-circle distance (the shortest path between two points along the sphere's surface).
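The haversine computation itself fits in a few lines of Python. The coordinates come from the airport information below; the mean Earth radius of 6,371 km is a common assumption and may differ slightly from the radius the site uses:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points on a sphere."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return radius_km * 2 * asin(sqrt(a))

# MNG: 12°3′21″S, 134°14′2″E    ABM: 10°57′2″S, 142°27′32″E
print(round(haversine_km(-12.055833, 134.233889, -10.950556, 142.458889), 2))  # ≈ 904.55 km
```

Note the spherical result (904.55 km) differs from the ellipsoidal Vincenty result (905.58 km) by about a kilometer over this route.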

How long does it take to fly from Maningrida to Bamaga?

The estimated flight time from Maningrida Airport to Northern Peninsula Airport is 1 hour and 33 minutes.
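Estimates like this are typically derived from the great-circle distance using an assumed cruise speed plus a fixed allowance for taxi, climb, and descent. A minimal sketch, where the ~500 mph cruise speed and 30-minute overhead are illustrative assumptions rather than the site's published formula:

```python
def estimated_flight_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough flight time: cruise time plus a fixed taxi/climb/descent allowance."""
    return distance_miles / cruise_mph * 60 + overhead_min

minutes = estimated_flight_minutes(563)
print(f"{int(minutes // 60)} h {round(minutes % 60)} min")  # ≈ 1 h 38 min with these assumptions
```

The quoted 1 h 33 min implies slightly different parameters (for example, a faster assumed cruise speed or a smaller fixed allowance).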

Flight carbon footprint between Maningrida Airport (MNG) and Northern Peninsula Airport (ABM)

On average, flying from Maningrida to Bamaga generates about 108 kg of CO2 per passenger, which equals about 238 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
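The per-passenger figure is consistent with an emission factor of roughly 0.12 kg of CO2 per passenger-kilometer, a commonly cited order of magnitude for short regional flights; the exact factor the site uses is an assumption here. The pound conversion, by contrast, is exact:

```python
distance_km = 906
co2_factor = 0.12              # assumed kg CO2 per passenger-km (illustrative, not the site's factor)
co2_kg = distance_km * co2_factor
print(round(co2_kg))           # ≈ 109 kg, close to the quoted 108 kg

KG_PER_LB = 0.45359237         # exact definition of the avoirdupois pound
print(round(108 / KG_PER_LB))  # 238 lbs
```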

Map of flight path and driving directions from Maningrida to Bamaga

See the map of the shortest flight path between Maningrida Airport (MNG) and Northern Peninsula Airport (ABM).

Airport information

Origin Maningrida Airport
City: Maningrida
Country: Australia
IATA Code: MNG
ICAO Code: YMGD
Coordinates: 12°3′21″S, 134°14′2″E
Destination Northern Peninsula Airport
City: Bamaga
Country: Australia
IATA Code: ABM
ICAO Code: YBAM
Coordinates: 10°57′2″S, 142°27′32″E
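The coordinates above are given in degrees-minutes-seconds; the distance formulas work in signed decimal degrees. A small helper sketch for the conversion:

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter (N/S/E/W) to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

# Maningrida Airport: 12°3′21″S, 134°14′2″E
print(round(dms_to_decimal(12, 3, 21, "S"), 6))    # -12.055833
print(round(dms_to_decimal(134, 14, 2, "E"), 6))   # 134.233889
```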