
How far is Bamaga from Strahan?

The distance between Strahan (Strahan Airport) and Bamaga (Northern Peninsula Airport) is 2155 miles / 3469 kilometers / 1873 nautical miles.

The driving distance from Strahan (SRN) to Bamaga (ABM) is 2839 miles / 4569 kilometers, and travel time by car is about 68 hours 19 minutes.

Strahan Airport – Northern Peninsula Airport

2155 miles / 3469 kilometers / 1873 nautical miles


Distance from Strahan to Bamaga

There are several ways to calculate the distance from Strahan to Bamaga. Here are two standard methods:

Vincenty's formula (applied above)
  • 2155.431 miles
  • 3468.830 kilometers
  • 1873.018 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
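
As a rough illustration (a sketch, not the site's own implementation), the Python below applies Vincenty's inverse formula on the WGS-84 ellipsoid; fed the airport coordinates listed further down, it should land close to the 3468.83 km figure above.

```python
import math

def vincenty_distance_m(lat1, lon1, lat2, lon2):
    """Inverse Vincenty formula on the WGS-84 ellipsoid; returns metres."""
    a = 6378137.0            # semi-major axis (m)
    f = 1 / 298.257223563    # flattening
    b = (1 - f) * a          # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):                      # iterate until lambda converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0                        # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos_sq_alpha = 1.0 - sin_alpha ** 2
        cos2sm = cos_sigma - 2.0 * sinU1 * sinU2 / cos_sq_alpha if cos_sq_alpha else 0.0
        C = f / 16.0 * cos_sq_alpha * (4.0 + f * (4.0 - 3.0 * cos_sq_alpha))
        lam_prev = lam
        lam = L + (1.0 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos2sm + C * cos_sigma * (-1.0 + 2.0 * cos2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u_sq = cos_sq_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
    B = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
    delta_sigma = B * sin_sigma * (cos2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos2sm ** 2)
        - B / 6 * cos2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos2sm ** 2)))
    return b * A * (sigma - delta_sigma)

# Strahan (SRN) to Bamaga (ABM), decimal degrees from the airport data below
print(vincenty_distance_m(-42.1547, 145.2919, -10.9506, 142.4589) / 1000)  # ≈ 3468.8 km
```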

Haversine formula
  • 2162.792 miles
  • 3480.677 kilometers
  • 1879.415 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
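
For comparison, here is a minimal haversine sketch in Python, assuming a mean Earth radius of 6,371 km (a common convention, not a value stated by this page):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points on a sphere, in km."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

# Strahan (SRN) to Bamaga (ABM), decimal degrees from the airport data below
print(haversine_km(-42.1547, 145.2919, -10.9506, 142.4589))  # ≈ 3480.7 km
```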

How long does it take to fly from Strahan to Bamaga?

The estimated flight time from Strahan Airport to Northern Peninsula Airport is 4 hours and 34 minutes.

What is the time difference between Strahan and Bamaga?

Strahan and Bamaga are both on Australian Eastern Standard Time (UTC+10), so there is normally no time difference. During daylight saving, however, Tasmania moves one hour ahead of Queensland.

Flight carbon footprint between Strahan Airport (SRN) and Northern Peninsula Airport (ABM)

On average, flying from Strahan to Bamaga generates about 235 kg (519 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
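
As a quick sanity check on the unit conversion (and on the per-kilometre rate the estimate implies), the snippet below uses the standard 2.20462 lb/kg factor; the small gap from the 519 lb figure above is just rounding of the underlying kilogram value.

```python
KG_TO_LB = 2.20462

co2_kg = 235          # estimated CO2 per passenger for this flight (from above)
distance_km = 3469    # flight distance (from above)

print(round(co2_kg * KG_TO_LB))             # 518 lb
print(round(co2_kg / distance_km * 1000))   # ~68 g CO2 per passenger-km implied
```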

Map of flight path and driving directions from Strahan to Bamaga

See the map of the shortest flight path between Strahan Airport (SRN) and Northern Peninsula Airport (ABM).

Airport information

Origin: Strahan Airport
City: Strahan
Country: Australia
IATA Code: SRN
ICAO Code: YSRN
Coordinates: 42°9′17″S, 145°17′31″E
Destination: Northern Peninsula Airport
City: Bamaga
Country: Australia
IATA Code: ABM
ICAO Code: YBAM
Coordinates: 10°57′2″S, 142°27′32″E