How far is Bamaga from Windorah?

The distance between Windorah (Windorah Airport) and Bamaga (Northern Peninsula Airport) is 995 miles / 1601 kilometers / 864 nautical miles.

The driving distance from Windorah (WNR) to Bamaga (ABM) is 1363 miles / 2194 kilometers, and travel time by car is about 36 hours 10 minutes.

Windorah Airport – Northern Peninsula Airport

995 miles / 1601 kilometers / 864 nautical miles

Distance from Windorah to Bamaga

There are several ways to calculate the distance from Windorah to Bamaga. Here are two standard methods:

Vincenty's formula (applied above)
  • 994.779 miles
  • 1600.942 kilometers
  • 864.440 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
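
For readers who want to reproduce the ellipsoidal figure, here is a minimal sketch in Python using the geopy library. Note that geopy's geodesic() uses Karney's algorithm on the WGS-84 ellipsoid rather than Vincenty's formula itself, but the two agree to well under a metre over a route like this. The coordinates are the ones listed under Airport information below.

```python
# A rough sketch (not the calculator's own code): ellipsoidal distance with
# geopy, whose geodesic() uses Karney's algorithm on the WGS-84 ellipsoid.
from geopy.distance import geodesic

windorah = (-25.4131, 142.6669)   # Windorah Airport (WNR), decimal degrees
bamaga = (-10.9506, 142.4589)     # Northern Peninsula Airport (ABM)

d = geodesic(windorah, bamaga)
print(f"{d.miles:.3f} mi, {d.km:.3f} km, {d.nautical:.3f} nmi")
# Output should land very close to the ~994.8 mi / ~1600.9 km figure above.
```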

Haversine formula
  • 999.342 miles
  • 1608.286 kilometers
  • 868.405 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
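
The haversine formula is short enough to write out directly. Below is a minimal sketch assuming a spherical earth with mean radius 6371 km; the airport coordinates are taken from the Airport information section further down the page.

```python
# Minimal haversine sketch (spherical earth, mean radius 6371 km).
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points, in kilometres."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
    return 2 * radius_km * asin(sqrt(a))

km = haversine_km(-25.4131, 142.6669, -10.9506, 142.4589)  # WNR -> ABM
print(f"{km:.1f} km = {km / 1.609344:.1f} mi = {km / 1.852:.1f} nmi")
# Roughly 1608 km / 999 mi / 868 nmi, matching the haversine figures above.
```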

How long does it take to fly from Windorah to Bamaga?

The estimated flight time from Windorah Airport to Northern Peninsula Airport is 2 hours and 23 minutes.
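
The calculator does not publish its exact formula, but a common rule of thumb, assumed here, is to add about 30 minutes for taxi, climb and descent to the time spent cruising at roughly 850 km/h. For the 1601 km between the two airports this happens to reproduce a figure of about 2 hours 23 minutes.

```python
# Back-of-the-envelope flight-time estimate (assumed rule of thumb,
# not the calculator's published formula).
def estimate_flight_minutes(distance_km, cruise_kmh=850.0, overhead_min=30.0):
    return overhead_min + distance_km / cruise_kmh * 60.0

minutes = estimate_flight_minutes(1601)
print(f"{int(minutes // 60)} h {round(minutes % 60)} min")  # ~2 h 23 min
```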

What is the time difference between Windorah and Bamaga?

There is no time difference between Windorah and Bamaga; both are in Queensland and observe Australian Eastern Standard Time (UTC+10) year-round.

Flight carbon footprint between Windorah Airport (WNR) and Northern Peninsula Airport (ABM)

On average, flying from Windorah to Bamaga generates about 150 kg of CO2 per passenger; 150 kilograms is roughly 331 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.

Map of flight path and driving directions from Windorah to Bamaga

See the map of the shortest flight path between Windorah Airport (WNR) and Northern Peninsula Airport (ABM).

Airport information

Origin: Windorah Airport
City: Windorah
Country: Australia
IATA Code: WNR
ICAO Code: YWDH
Coordinates: 25°24′47″S, 142°40′1″E
Destination: Northern Peninsula Airport
City: Bamaga
Country: Australia
IATA Code: ABM
ICAO Code: YBAM
Coordinates: 10°57′2″S, 142°27′32″E
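
To feed these coordinates into either formula above, they first need to be converted from degrees-minutes-seconds to signed decimal degrees. The helper below is a small illustrative sketch (the function name is mine, not the calculator's).

```python
# Convert the degrees-minutes-seconds coordinates listed above into
# signed decimal degrees (south and west are negative).
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

wnr = (dms_to_decimal(25, 24, 47, "S"), dms_to_decimal(142, 40, 1, "E"))
abm = (dms_to_decimal(10, 57, 2, "S"), dms_to_decimal(142, 27, 32, "E"))
print(wnr)  # approximately (-25.4131, 142.6669)
print(abm)  # approximately (-10.9506, 142.4589)
```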