
How far is Beatrice, NE, from Arctic Bay?

The distance between Arctic Bay (Arctic Bay Airport) and Beatrice (Beatrice Municipal Airport) is 2297 miles / 3696 kilometers / 1996 nautical miles.

Distance from Arctic Bay to Beatrice

There are several ways to calculate the distance from Arctic Bay to Beatrice. Here are two standard methods:

Vincenty's formula (applied above)
  • 2296.603 miles
  • 3696.024 kilometers
  • 1995.693 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
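
As a minimal, self-contained sketch (not the calculator's actual code), Vincenty's inverse method can be written in Python as follows, using the WGS-84 ellipsoid and the decimal forms of the airport coordinates listed at the bottom of the page:

```python
import math

# WGS-84 ellipsoid constants
A_AXIS = 6378137.0          # semi-major axis, metres
FLAT = 1 / 298.257223563    # flattening
B_AXIS = (1 - FLAT) * A_AXIS

def vincenty_miles(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Distance in statute miles via Vincenty's inverse formula (WGS-84)."""
    U1 = math.atan((1 - FLAT) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - FLAT) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0.0:
            return 0.0                      # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1.0 - sin_alpha ** 2
        # cos(2 * sigma_m); zero for equatorial lines where cos2_alpha == 0
        cos_2sm = (cos_sigma - 2.0 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
        C = FLAT / 16.0 * cos2_alpha * (4.0 + FLAT * (4.0 - 3.0 * cos2_alpha))
        lam_prev = lam
        lam = L + (1.0 - C) * FLAT * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1.0 + 2.0 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (A_AXIS ** 2 - B_AXIS ** 2) / B_AXIS ** 2
    big_a = 1.0 + u2 / 16384.0 * (4096.0 + u2 * (-768.0 + u2 * (320.0 - 175.0 * u2)))
    big_b = u2 / 1024.0 * (256.0 + u2 * (-128.0 + u2 * (74.0 - 47.0 * u2)))
    delta_sigma = big_b * sin_sigma * (cos_2sm + big_b / 4.0 * (
        cos_sigma * (-1.0 + 2.0 * cos_2sm ** 2)
        - big_b / 6.0 * cos_2sm * (-3.0 + 4.0 * sin_sigma ** 2) * (-3.0 + 4.0 * cos_2sm ** 2)))
    metres = B_AXIS * big_a * (sigma - delta_sigma)
    return metres / 1609.344                # metres -> statute miles

# Decimal-degree coordinates of YAB and BIE (from the airport table below)
print(round(vincenty_miles(73.00556, -85.0425, 40.30111, -96.75389), 1))  # ≈ 2296.6
```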

Haversine formula
  • 2293.440 miles
  • 3690.934 kilometers
  • 1992.945 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
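
Here is a minimal sketch of the haversine computation, assuming the commonly used mean Earth radius of 6,371 km (the exact radius behind the site's figures isn't stated):

```python
import math

MEAN_RADIUS_MILES = 6371.0 / 1.609344   # mean Earth radius, ~3958.8 statute miles

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance in statute miles on a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * MEAN_RADIUS_MILES * math.asin(math.sqrt(a))

print(round(haversine_miles(73.00556, -85.0425, 40.30111, -96.75389), 1))  # ≈ 2293.4
```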

How long does it take to fly from Arctic Bay to Beatrice?

The estimated flight time from Arctic Bay Airport to Beatrice Municipal Airport is 4 hours and 50 minutes.
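
The model behind this estimate isn't published. A common back-of-envelope approach assumes an average cruise speed plus a fixed allowance for taxi, climb, and descent; with an assumed ~500 mph cruise and ~15 minutes of overhead, the figure above is roughly reproduced:

```python
def estimated_flight_time(distance_miles, cruise_mph=500.0, overhead_min=15.0):
    """Back-of-envelope flight time: cruise leg plus fixed taxi/climb/descent time.
    Both parameters are assumptions, not the calculator's published model."""
    total_min = distance_miles / cruise_mph * 60.0 + overhead_min
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} h {minutes} min"

print(estimated_flight_time(2296.6))  # -> "4 h 51 min", close to the 4 h 50 min above
```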

What is the time difference between Arctic Bay and Beatrice?

There is no time difference between Arctic Bay and Beatrice.

Flight carbon footprint between Arctic Bay Airport (YAB) and Beatrice Municipal Airport (BIE)

On average, flying from Arctic Bay to Beatrice generates about 252 kg (555 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
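
The calculator's exact emissions model isn't published. One common approximation multiplies per-passenger fuel burn by the standard jet-fuel combustion factor of about 3.16 kg of CO2 per kg of fuel; with an assumed average burn of ~0.035 kg of fuel per seat-mile, the result lands near the figure above:

```python
KG_CO2_PER_KG_FUEL = 3.16   # standard combustion factor for jet fuel

def co2_per_passenger_kg(distance_miles, fuel_kg_per_seat_mile=0.035):
    """Rough per-passenger CO2 from fuel burn. The per-seat-mile burn rate is
    an assumed fleet average, not the calculator's actual parameter."""
    return distance_miles * fuel_kg_per_seat_mile * KG_CO2_PER_KG_FUEL

print(round(co2_per_passenger_kg(2296.6)))  # -> 254 kg, near the ~252 kg estimate above
```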

Map of flight path from Arctic Bay to Beatrice

See the map of the shortest flight path between Arctic Bay Airport (YAB) and Beatrice Municipal Airport (BIE).

Airport information

Origin: Arctic Bay Airport
City: Arctic Bay
Country: Canada
IATA Code: YAB
ICAO Code: CYAB
Coordinates: 73°0′20″N, 85°2′33″W

Destination: Beatrice Municipal Airport
City: Beatrice, NE
Country: United States
IATA Code: BIE
ICAO Code: KBIE
Coordinates: 40°18′4″N, 96°45′14″W
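
The coordinates above are given in degrees, minutes, and seconds; the decimal values used in the distance sketches earlier can be derived with a small helper like this (a sketch, with hemisphere handling as shown):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to decimal degrees."""
    sign = -1.0 if hemisphere in ("S", "W") else 1.0
    return sign * (degrees + minutes / 60.0 + seconds / 3600.0)

# Airport coordinates from the table above
yab = (dms_to_decimal(73, 0, 20, "N"), dms_to_decimal(85, 2, 33, "W"))   # (73.00556, -85.0425)
bie = (dms_to_decimal(40, 18, 4, "N"), dms_to_decimal(96, 45, 14, "W"))  # (40.30111, -96.75389)
```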