How far is Barrow, AK, from St. George Island, AK?

The distance between St. George Island (St. George Airport) and Barrow (Wiley Post–Will Rogers Memorial Airport) is 1086 miles / 1748 kilometers / 944 nautical miles.

Distance from St. George Island to Barrow

There are several ways to calculate the distance from St. George Island to Barrow. Here are two standard methods:

Vincenty's formula (applied above)
  • 1086.449 miles
  • 1748.470 kilometers
  • 944.098 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
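
For reference, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The ellipsoid constants, iteration limit and convergence tolerance are common defaults rather than values published on this page; with the airport coordinates listed further below it should reproduce roughly the 1748 km figure.

    import math

    def vincenty_distance_m(lat1, lon1, lat2, lon2):
        """Vincenty inverse formula: geodesic distance in metres on the WGS-84 ellipsoid."""
        a = 6378137.0               # semi-major axis (m)
        f = 1 / 298.257223563       # flattening
        b = (1 - f) * a             # semi-minor axis (m)

        L = math.radians(lon2 - lon1)
        U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
        U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)

        lam = L
        for _ in range(200):        # iterate until the longitude difference converges
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.hypot(cosU2 * sin_lam,
                                   cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
            if sin_sigma == 0:
                return 0.0          # coincident points
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos2_alpha = 1 - sin_alpha ** 2
            cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
            C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
            lam_prev = lam
            lam = L + (1 - C) * f * sin_alpha * (
                sigma + C * sin_sigma * (cos_2sigma_m + C * cos_sigma *
                                         (-1 + 2 * cos_2sigma_m ** 2)))
            if abs(lam - lam_prev) < 1e-12:
                break

        u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
            B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) *
            (-3 + 4 * cos_2sigma_m ** 2)))
        return b * A * (sigma - delta_sigma)

    # STG and BRW coordinates in decimal degrees (west longitudes negative)
    stg = (56.577222, -169.663611)
    brw = (71.285278, -156.765833)
    print(round(vincenty_distance_m(*stg, *brw) / 1000, 1), "km")  # roughly 1748 km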

Haversine formula
  • 1083.576 miles
  • 1743.846 kilometers
  • 941.602 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
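
The haversine sketch below assumes a mean earth radius of 6371 km, a common convention; the page does not state which radius it uses, so the result may differ slightly from the figures above.

    import math

    def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
        """Great-circle distance on a sphere of the given mean radius (km)."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        h = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
        return 2 * radius_km * math.asin(math.sqrt(h))

    # STG to BRW; should land near the ~1744 km figure quoted above
    print(round(haversine_km(56.577222, -169.663611, 71.285278, -156.765833), 1), "km")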

How long does it take to fly from St. George Island to Barrow?

The estimated flight time from St. George Airport to Wiley Post–Will Rogers Memorial Airport is 2 hours and 33 minutes.
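
The assumptions behind this estimate are not published on the page. A generic sketch, assuming an average block speed of 500 mph plus a fixed 30-minute allowance for taxi, climb and descent (both illustrative values, so the output will not match the 2 hours 33 minutes figure exactly):

    def estimate_flight_time(distance_miles, avg_speed_mph=500, overhead_min=30):
        """Rough block-time estimate: cruise time plus a fixed taxi/climb/descent allowance."""
        total_min = distance_miles / avg_speed_mph * 60 + overhead_min
        hours, minutes = divmod(round(total_min), 60)
        return f"{hours} h {minutes} min"

    print(estimate_flight_time(1086))  # about 2 h 40 min with these illustrative assumptions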

Flight carbon footprint between St. George Airport (STG) and Wiley Post–Will Rogers Memorial Airport (BRW)

On average, flying from St. George Island to Barrow generates about 156 kg of CO2 per passenger (156 kilograms is about 344 pounds). These figures are estimates and include only the CO2 generated by burning jet fuel.
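
The kilogram-to-pound conversion behind the 344 lb figure works out as follows (the constant is the standard definition of the pound):

    CO2_KG = 156
    KG_PER_LB = 0.45359237            # one pound in kilograms
    print(round(CO2_KG / KG_PER_LB))  # 344 lb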

Map of flight path from St. George Island to Barrow

See the map of the shortest flight path between St. George Airport (STG) and Wiley Post–Will Rogers Memorial Airport (BRW).

Airport information

Origin: St. George Airport
City: St. George Island, AK
Country: United States
IATA Code: STG
ICAO Code: PAPB
Coordinates: 56°34′38″N, 169°39′49″W
Destination: Wiley Post–Will Rogers Memorial Airport
City: Barrow, AK
Country: United States
IATA Code: BRW
ICAO Code: PABR
Coordinates: 71°17′7″N, 156°45′57″W
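
To plug the coordinates above into the distance formulas, the degree/minute/second values first have to be converted to signed decimal degrees. A minimal sketch (the helper name is illustrative):

    def dms_to_decimal(degrees, minutes, seconds, hemisphere):
        """Convert degrees/minutes/seconds plus hemisphere letter to signed decimal degrees."""
        sign = -1 if hemisphere in ("S", "W") else 1
        return sign * (degrees + minutes / 60 + seconds / 3600)

    # St. George Airport (STG): 56°34′38″N, 169°39′49″W
    print(dms_to_decimal(56, 34, 38, "N"), dms_to_decimal(169, 39, 49, "W"))
    # Wiley Post–Will Rogers Memorial Airport (BRW): 71°17′7″N, 156°45′57″W
    print(dms_to_decimal(71, 17, 7, "N"), dms_to_decimal(156, 45, 57, "W"))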