
How far is Albury from Strahan?

The distance between Strahan (Strahan Airport) and Albury (Albury Airport) is 429 miles / 691 kilometers / 373 nautical miles.

The driving distance from Strahan (SRN) to Albury (ABX) is 615 miles / 990 kilometers, and travel time by car is about 15 hours 58 minutes.

Strahan Airport – Albury Airport
429 miles / 691 kilometers / 373 nautical miles


Distance from Strahan to Albury

There are several ways to calculate the distance from Strahan to Albury. Here are two standard methods:

Vincenty's formula (applied above)
  • 429.328 miles
  • 690.936 kilometers
  • 373.076 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
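As a rough illustration, an ellipsoidal distance of this kind can be reproduced with the geopy library. Note that geopy's geodesic() uses Karney's algorithm on the WGS-84 ellipsoid rather than Vincenty's iteration itself, so treat this as a sketch that should land very close to the figures above; the decimal coordinates are converted from the DMS values in the Airport information section.

```python
# Sketch: ellipsoidal (WGS-84) distance with geopy.
# geopy's geodesic() uses Karney's algorithm rather than Vincenty's
# iteration, but it yields essentially the same ellipsoidal result.
from geopy.distance import geodesic

# Airport coordinates from the "Airport information" section,
# converted to decimal degrees (south is negative).
STRAHAN = (-42.1547, 145.2919)   # SRN: 42°9′17″S, 145°17′31″E
ALBURY = (-36.0678, 146.9578)    # ABX: 36°4′4″S, 146°57′28″E

d = geodesic(STRAHAN, ALBURY)
print(f"{d.miles:.1f} mi / {d.km:.1f} km / {d.nautical:.1f} NM")
# Expected to land near 429.3 mi / 690.9 km / 373.1 NM.
```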

Haversine formula
  • 429.938 miles
  • 691.918 kilometers
  • 373.606 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
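A minimal sketch of the haversine calculation, assuming a mean earth radius of 6,371 km and the decimal-degree coordinates from the Airport information section:

```python
# Minimal haversine sketch (spherical earth, mean radius 6371 km).
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, r_km=6371.0):
    """Great-circle distance between two lat/lon points, in kilometers."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
    return 2 * r_km * asin(sqrt(a))

# SRN -> ABX, decimal degrees from the Airport information section.
km = haversine_km(-42.1547, 145.2919, -36.0678, 146.9578)
print(f"{km:.1f} km = {km * 0.621371:.1f} mi = {km / 1.852:.1f} NM")
# Expected to land near the ~691.9 km haversine figure above.
```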

How long does it take to fly from Strahan to Albury?

The estimated flight time from Strahan Airport to Albury Airport is 1 hour and 18 minutes.
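The calculator's exact assumptions are not published; a common rule of thumb is a fixed allowance of about 30 minutes for taxi, climb, and descent plus cruise time at roughly 500 mph. A sketch under those assumed figures lands in the same ballpark as the estimate above:

```python
# Rough block-time sketch. The calculator's exact assumptions are not
# stated; the overhead and cruise speed below are assumptions.
DISTANCE_MI = 429.3   # great-circle distance from the section above
CRUISE_MPH = 500.0    # assumed average cruise speed
OVERHEAD_MIN = 30.0   # assumed fixed allowance for taxi, climb, descent

total_min = OVERHEAD_MIN + DISTANCE_MI / CRUISE_MPH * 60
print(f"~{int(total_min // 60)} h {int(total_min % 60)} min")
# ~1 h 21 min -- close to the 1 hour 18 minutes quoted above.
```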

What is the time difference between Strahan and Albury?

There is no time difference between Strahan and Albury.
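This can be checked against IANA time zone data; assuming Strahan falls under Australia/Hobart and Albury under Australia/Sydney (zone assignments are not stated in the text), both report the same UTC offset at any given moment:

```python
# Sketch: confirming the zero time difference with IANA time zones.
# The zone assignments are assumptions based on the airports' states.
from datetime import datetime
from zoneinfo import ZoneInfo

now = datetime.now(ZoneInfo("UTC"))
for zone in ("Australia/Hobart", "Australia/Sydney"):
    print(zone, now.astimezone(ZoneInfo(zone)).utcoffset())
# Both print the same offset (+10:00 standard time, +11:00 during DST).
```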

Flight carbon footprint between Strahan Airport (SRN) and Albury Airport (ABX)

On average, flying from Strahan to Albury generates about 88 kg of CO2 per passenger (equivalent to 194 pounds). These figures are estimates and include only the CO2 generated by burning jet fuel.
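A back-of-envelope sketch: the per-passenger-kilometre factor implied by the figures above is roughly 88 kg over ~691 km, or about 0.13 kg of CO2 per passenger-km for fuel burn alone; the pound figure is a straight unit conversion.

```python
# Back-of-envelope CO2 sketch. The per-passenger-km factor below is
# simply the one implied by the figures above (88 kg over ~691 km);
# it covers jet-fuel burn only, per the note in the text.
DISTANCE_KM = 690.9
CO2_KG_PER_PAX_KM = 88.0 / 690.9   # ~0.127, implied by the quoted figures

co2_kg = DISTANCE_KM * CO2_KG_PER_PAX_KM
co2_lb = co2_kg * 2.20462          # kilograms to pounds
print(f"~{co2_kg:.0f} kg CO2 per passenger (~{co2_lb:.0f} lb)")
# ~88 kg (~194 lb), matching the estimate above.
```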

Map of flight path and driving directions from Strahan to Albury

See the map of the shortest flight path between Strahan Airport (SRN) and Albury Airport (ABX).

Airport information

Origin Strahan Airport
City: Strahan
Country: Australia
IATA Code: SRN
ICAO Code: YSRN
Coordinates: 42°9′17″S, 145°17′31″E
Destination Albury Airport
City: Albury
Country: Australia
IATA Code: ABX
ICAO Code: YMAY
Coordinates: 36°4′4″S, 146°57′28″E
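For reference, the DMS coordinates listed above convert to the decimal degrees used in the distance sketches earlier; a minimal conversion helper (the function name is illustrative):

```python
# Sketch: converting the DMS coordinates listed above to the decimal
# degrees used in the distance examples (south and west are negative).
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

print(dms_to_decimal(42, 9, 17, "S"), dms_to_decimal(145, 17, 31, "E"))   # SRN
print(dms_to_decimal(36, 4, 4, "S"), dms_to_decimal(146, 57, 28, "E"))    # ABX
# -> approximately -42.1547, 145.2919 and -36.0678, 146.9578
```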