
How far is Nakashibetsu from Batagay?

The distance between Batagay (Batagay Airport) and Nakashibetsu (Nakashibetsu Airport) is 1708 miles / 2748 kilometers / 1484 nautical miles.

The driving distance from Batagay (BQJ) to Nakashibetsu (SHB) is 4827 miles / 7768 kilometers, and travel time by car is about 122 hours 28 minutes.

Batagay Airport – Nakashibetsu Airport

1708 miles / 2748 kilometers / 1484 nautical miles


Distance from Batagay to Nakashibetsu

There are several ways to calculate the distance from Batagay to Nakashibetsu. Here are two standard methods:

Vincenty's formula (applied above)
  • 1707.640 miles
  • 2748.180 kilometers
  • 1483.898 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
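As a sanity check, the ellipsoidal distance can be reproduced with the geopy library's geodesic function, which, like Vincenty's formula, works on the WGS-84 ellipsoid (via Karney's newer algorithm, so the result may differ from the figures above by a tiny fraction). The coordinates below are the airport coordinates from the bottom of this page, converted to decimal degrees:

    from geopy.distance import geodesic

    batagay = (67.64778, 134.69500)       # 67°38′52″N, 134°41′42″E
    nakashibetsu = (43.57722, 144.96000)  # 43°34′38″N, 144°57′36″E

    d = geodesic(batagay, nakashibetsu)
    print(f"{d.miles:.3f} mi / {d.kilometers:.3f} km / {d.nautical:.3f} nmi")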

Haversine formula
  • 1705.409 miles
  • 2744.590 kilometers
  • 1481.960 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical Earth, yielding the great-circle distance: the shortest path between two points along the surface of the sphere.
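A minimal, self-contained Python version of the haversine formula, using the same airport coordinates in decimal degrees and a mean Earth radius of 6371 km, reproduces the figures above:

    from math import radians, sin, cos, asin, sqrt

    def haversine_km(lat1, lon1, lat2, lon2, r_km=6371.0):
        """Great-circle distance on a sphere of radius r_km (mean Earth radius)."""
        phi1, phi2 = radians(lat1), radians(lat2)
        dphi = radians(lat2 - lat1)
        dlam = radians(lon2 - lon1)
        a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
        return 2 * r_km * asin(sqrt(a))

    km = haversine_km(67.64778, 134.695, 43.57722, 144.96)
    print(f"{km:.1f} km = {km * 0.621371:.1f} mi = {km / 1.852:.1f} nmi")
    # -> roughly 2744.6 km, matching the haversine figure above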

How long does it take to fly from Batagay to Nakashibetsu?

The estimated flight time from Batagay Airport to Nakashibetsu Airport is 3 hours and 43 minutes.
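The site does not publish the parameters behind this estimate. A common back-of-the-envelope model is distance divided by cruise speed plus a fixed allowance for taxi, climb, and descent; the speed and overhead below are assumptions, not the calculator's actual values, so the result only approximates the 3 hours 43 minutes quoted above:

    # Rough flight-time estimate: distance / cruise speed + fixed overhead.
    distance_mi = 1708
    cruise_mph = 500     # assumed typical jet cruise speed
    overhead_min = 30    # assumed taxi/climb/descent allowance
    total_min = distance_mi / cruise_mph * 60 + overhead_min
    print(f"{int(total_min // 60)} h {int(total_min % 60)} min")  # ~3 h 54 min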

Flight carbon footprint between Batagay Airport (BQJ) and Nakashibetsu Airport (SHB)

On average, flying from Batagay to Nakashibetsu generates about 193 kg (426 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
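For reference, the unit conversion and the per-kilometer emission factor implied by these numbers can be checked directly (the small rounding gap against the page's 426 lb suggests the underlying figure is slightly above 193 kg):

    co2_kg = 193.0
    distance_km = 2748
    print(f"{co2_kg * 2.20462:.0f} lb")                       # 425 lb (page rounds to 426)
    print(f"{co2_kg / distance_km * 1000:.0f} g CO2 per km")  # ~70 g per passenger-km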

Map of flight path and driving directions from Batagay to Nakashibetsu

See the map of the shortest flight path between Batagay Airport (BQJ) and Nakashibetsu Airport (SHB).

Airport information

Origin: Batagay Airport
City: Batagay
Country: Russia
IATA Code: BQJ
ICAO Code: UEBB
Coordinates: 67°38′52″N, 134°41′42″E
Destination: Nakashibetsu Airport
City: Nakashibetsu
Country: Japan
IATA Code: SHB
ICAO Code: RJCN
Coordinates: 43°34′38″N, 144°57′36″E