
How far is Bamaga from Moruya?

The distance between Moruya (Moruya Airport) and Bamaga (Northern Peninsula Airport) is 1783 miles / 2870 kilometers / 1550 nautical miles.

The driving distance from Moruya (MYA) to Bamaga (ABM) is 2234 miles / 3595 kilometers, and travel time by car is about 53 hours 11 minutes.

Moruya Airport – Northern Peninsula Airport: 1783 miles / 2870 kilometers / 1550 nautical miles


Distance from Moruya to Bamaga

There are several ways to calculate the distance from Moruya to Bamaga. Here are two standard methods:

Vincenty's formula (applied above)
  • 1783.296 miles
  • 2869.936 kilometers
  • 1549.641 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
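
As a concrete illustration, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid, using the airport coordinates listed at the bottom of this page. The convergence tolerance and iteration cap are arbitrary choices, and production code should prefer a tested library such as geographiclib.

```python
import math

# WGS-84 ellipsoid parameters
A = 6378137.0              # semi-major axis, metres
F = 1 / 298.257223563      # flattening
B = (1 - F) * A            # semi-minor axis, metres

def vincenty_inverse(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Vincenty's inverse formula: geodesic distance in metres on WGS-84."""
    U1 = math.atan((1 - F) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - F) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # cos(2 * sigma_m); zero for equatorial geodesics where cos2_alpha == 0
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
        C = F / 16 * cos2_alpha * (4 + F * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * F * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (A ** 2 - B ** 2) / B ** 2
    big_a = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    big_b = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = big_b * sin_sigma * (cos_2sm + big_b / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - big_b / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return B * big_a * (sigma - delta_sigma)

# MYA and ABM coordinates from the airport information below
mya = (-35.897778, 150.143889)   # 35°53′52″S, 150°8′38″E
abm = (-10.950556, 142.458889)   # 10°57′2″S, 142°27′32″E
print(vincenty_inverse(*mya, *abm) / 1000)  # ≈ 2869.9 km, matching the figure above
```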

Haversine formula
  • 1789.545 miles
  • 2879.993 kilometers
  • 1555.072 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth, giving the great-circle distance: the shortest path between the two points along the surface.
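
A matching haversine sketch follows. The Earth radius this page uses is not stated; the commonly used mean radius of 6371 km is assumed here, and it lands within a fraction of a kilometre of the figure above.

```python
import math

def haversine(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in kilometres on a sphere of the given radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

print(haversine(-35.897778, 150.143889, -10.950556, 142.458889))
# ≈ 2880 km, consistent with the haversine figure above
```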

How long does it take to fly from Moruya to Bamaga?

The estimated flight time from Moruya Airport to Northern Peninsula Airport is 3 hours and 52 minutes.
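
The page does not say how this estimate is derived. Figures like this are commonly built from the great-circle distance, an assumed average speed, and a fixed taxi/climb allowance; the sketch below is a hypothetical reconstruction with made-up parameters, not the site's actual method.

```python
def estimate_flight_time(distance_miles: float,
                         cruise_mph: float = 500.0,    # assumed average speed
                         overhead_min: float = 30.0) -> str:  # assumed taxi/climb allowance
    """Rough block-time estimate: cruise time plus a fixed overhead."""
    total_min = round(distance_miles / cruise_mph * 60 + overhead_min)
    hours, minutes = divmod(total_min, 60)
    return f"{hours} hours {minutes} minutes"

print(estimate_flight_time(1783))  # "4 hours 4 minutes" with these assumed parameters
```

With these assumed parameters the estimate comes out near 4 hours, in the same ballpark as, but not identical to, the 3 hours 52 minutes quoted above.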

What is the time difference between Moruya and Bamaga?

Moruya and Bamaga share the same standard time, UTC+10. During daylight saving time, however, Moruya (in New South Wales) shifts to UTC+11 while Bamaga (in Queensland) stays on UTC+10, so Moruya runs one hour ahead for part of the year.

Flight carbon footprint between Moruya Airport (MYA) and Northern Peninsula Airport (ABM)

On average, flying from Moruya to Bamaga generates about 199 kg (438 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
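
The pound figure is a straightforward unit conversion to check; the page's 438 lb suggests it converts an unrounded kilogram value before rounding:

```python
KG_PER_LB = 0.45359237          # exact definition of the avoirdupois pound
co2_kg = 199
print(f"{co2_kg / KG_PER_LB:.1f} lb")  # 438.7 lb, shown on the page as 438
```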

Map of flight path and driving directions from Moruya to Bamaga

See the map of the shortest flight path between Moruya Airport (MYA) and Northern Peninsula Airport (ABM).

Airport information

Origin: Moruya Airport
City: Moruya
Country: Australia
IATA Code: MYA
ICAO Code: YMRY
Coordinates: 35°53′52″S, 150°8′38″E
Destination: Northern Peninsula Airport
City: Bamaga
Country: Australia
IATA Code: ABM
ICAO Code: YBAM
Coordinates: 10°57′2″S, 142°27′32″E