
How far is Sandy Lake from Hibbing, MN?

The distance between Hibbing (Hibbing Range Regional Airport) and Sandy Lake (Sandy Lake Airport) is 393 miles / 633 kilometers / 342 nautical miles.


Distance from Hibbing to Sandy Lake

There are several ways to calculate the distance from Hibbing to Sandy Lake. Here are two standard methods:

Vincenty's formula (applied above)
  • 393.056 miles
  • 632.562 kilometers
  • 341.556 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
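
As a sketch of how this figure could be reproduced, here is a self-contained Python implementation of Vincenty's inverse method on the WGS-84 ellipsoid, using the airport coordinates listed at the bottom of this page. The function name, tolerance, and iteration cap are illustrative assumptions, not details published by this calculator.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns miles.

    Illustrative sketch -- the name, tolerance, and iteration cap are
    assumptions, not details taken from the calculator above.
    """
    a = 6378137.0               # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563       # WGS-84 flattening
    b = (1 - f) * a             # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0          # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - d_sigma) / 1609.344   # metres -> international miles

# HIB 47°23′11″N 92°50′20″W and ZSJ 53°3′51″N 93°20′39″W in decimal degrees
print(round(vincenty_miles(47.386389, -92.838889, 53.064167, -93.344167), 3))
# -> approximately 393.056, the figure quoted above
```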

Haversine formula
  • 392.918 miles
  • 632.340 kilometers
  • 341.436 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance, the shortest path between two points along the sphere's surface).
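
The haversine version is compact enough to show in full. Again a sketch: the function name and the mean Earth radius of 3,958.8 miles are assumed values, not ones published by this site.

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2, radius_mi=3958.8):
    """Great-circle distance assuming a spherical earth (sketch; the
    mean radius of 3958.8 mi is an assumed value)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_mi * math.asin(math.sqrt(a))

print(round(haversine_miles(47.386389, -92.838889, 53.064167, -93.344167), 3))
# -> approximately 392.9 miles, matching the haversine figure above
```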

How long does it take to fly from Hibbing to Sandy Lake?

The estimated flight time from Hibbing Range Regional Airport to Sandy Lake Airport is 1 hour and 14 minutes.
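
The page does not say how the estimate is derived. A common rule of thumb, offered here purely as an assumption, divides the distance by a typical cruise speed of about 500 mph and adds roughly 30 minutes for taxi, climb, and descent, which lands within a few minutes of the quoted figure.

```python
# Assumed rule of thumb, not the site's published formula:
# time = distance / ~500 mph cruise + ~30 min for taxi, climb, and descent.
distance_mi = 393
minutes = distance_mi / 500 * 60 + 30
print(f"{int(minutes // 60)} h {round(minutes % 60)} min")
# -> 1 h 17 min, in the same range as the 1 h 14 min quoted above
```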

What is the time difference between Hibbing and Sandy Lake?

There is no time difference between Hibbing and Sandy Lake.
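
This can be checked with Python's zoneinfo module. The IANA zone names below are my assumptions (both communities observe North American Central Time); the source does not name the zones.

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # Python 3.9+

# Assumed zones: America/Chicago for Hibbing, MN and
# America/Winnipeg for Sandy Lake, ON -- both Central Time.
now = datetime.now(timezone.utc)
offset_hib = now.astimezone(ZoneInfo("America/Chicago")).utcoffset()
offset_zsj = now.astimezone(ZoneInfo("America/Winnipeg")).utcoffset()
print(offset_hib == offset_zsj)  # True -> no time difference
```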

Flight carbon footprint between Hibbing Range Regional Airport (HIB) and Sandy Lake Airport (ZSJ)

On average, flying from Hibbing to Sandy Lake generates about 83 kg (183 lb) of CO2 per passenger. These figures are estimates and account only for the CO2 produced by burning jet fuel.
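
For the unit conversion, one kilogram is about 2.20462 pounds, so:

```python
kg = 83
print(round(kg * 2.20462))  # -> 183 lb, as quoted above
```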

Map of flight path from Hibbing to Sandy Lake

See the map of the shortest flight path between Hibbing Range Regional Airport (HIB) and Sandy Lake Airport (ZSJ).

Airport information

Origin: Hibbing Range Regional Airport
City: Hibbing, MN
Country: United States
IATA Code: HIB
ICAO Code: KHIB
Coordinates: 47°23′11″N, 92°50′20″W

Destination: Sandy Lake Airport
City: Sandy Lake
Country: Canada
IATA Code: ZSJ
ICAO Code: CZSJ
Coordinates: 53°3′51″N, 93°20′39″W