
How far is Nakashibetsu from Zyryanka?

The distance between Zyryanka (Zyryanka Airport) and Nakashibetsu (Nakashibetsu Airport) is 1548 miles / 2492 kilometers / 1346 nautical miles.

The driving distance from Zyryanka (ZKP) to Nakashibetsu (SHB) is 5002 miles / 8050 kilometers, and travel time by car is about 124 hours 47 minutes.

Zyryanka Airport – Nakashibetsu Airport

  • 1548 miles
  • 2492 kilometers
  • 1346 nautical miles


Distance from Zyryanka to Nakashibetsu

There are several ways to calculate the distance from Zyryanka to Nakashibetsu. Here are two standard methods:

Vincenty's formula (applied above)
  • 1548.379 miles
  • 2491.874 kilometers
  • 1345.504 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
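For readers who want to reproduce the ellipsoidal figure, here is a minimal sketch using geopy (an assumption: the page does not say which library it uses). geopy's geodesic() works on the WGS-84 ellipsoid and agrees with Vincenty's formula to well within the rounding shown on this page; the coordinates are the airport positions from the table at the bottom of the page.

```python
# Ellipsoidal (Vincenty-style) distance via geopy's geodesic(),
# which uses the WGS-84 ellipsoid by default.
from geopy.distance import geodesic

zkp = (65.736667, 150.705)   # Zyryanka Airport, decimal degrees
shb = (43.577222, 144.96)    # Nakashibetsu Airport, decimal degrees

d = geodesic(zkp, shb)
print(f"{d.miles:.3f} mi / {d.km:.3f} km / {d.nautical:.3f} NM")
# Expect approximately 1548 mi / 2492 km / 1346 NM
```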

Haversine formula
  • 1546.673 miles
  • 2489.129 kilometers
  • 1344.022 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth, giving the great-circle distance, i.e. the shortest path between two points over the earth's surface.
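The haversine calculation is simple enough to reproduce directly. A self-contained sketch, assuming a mean earth radius of 6371 km and using the airport coordinates from the table below:

```python
# Haversine (great-circle) distance on a sphere of mean radius 6371 km.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * radius_km * asin(sqrt(a))

km = haversine_km(65.736667, 150.705, 43.577222, 144.96)
print(f"{km:.3f} km / {km / 1.609344:.3f} mi / {km / 1.852:.3f} NM")
# Prints roughly 2489 km / 1547 mi / 1344 NM, matching the figures above
```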

How long does it take to fly from Zyryanka to Nakashibetsu?

The estimated flight time from Zyryanka Airport to Nakashibetsu Airport is 3 hours and 25 minutes.
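The page does not state how this estimate is derived. A common rule of thumb, sketched below as an assumption rather than the calculator's actual formula, is cruise time at roughly 500 mph plus about 30 minutes for taxi, climb and descent, which lands in the same ballpark as the quoted figure.

```python
# Flight-time rule of thumb (an assumption, not the calculator's
# published formula): distance at ~500 mph cruise speed plus ~30
# minutes of overhead for taxi, climb and descent.
def estimated_flight_time_hours(distance_miles, cruise_mph=500.0, overhead_hours=0.5):
    return overhead_hours + distance_miles / cruise_mph

t = estimated_flight_time_hours(1548)
print(f"{int(t)} h {round(t % 1 * 60)} min")  # 3 h 36 min, near the quoted 3 h 25 min
```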

Flight carbon footprint between Zyryanka Airport (ZKP) and Nakashibetsu Airport (SHB)

On average, flying from Zyryanka to Nakashibetsu generates about 183 kg of CO2 per passenger, equivalent to roughly 402 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
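The calculator does not publish its emission factor, but dividing its own figures gives roughly 183 kg ÷ 2492 km ≈ 0.073 kg of CO2 per passenger-kilometre. The sketch below uses that implied factor (an assumption) together with the standard kilogram-to-pound conversion.

```python
# Back-of-the-envelope CO2 estimate. The emission factor is an
# assumption implied by the page's own figures (183 kg over 2492 km);
# the calculator's real methodology may differ.
KG_CO2_PER_PASSENGER_KM = 183 / 2492   # ~0.073 kg CO2 per passenger-km
LB_PER_KG = 2.20462

co2_kg = 2492 * KG_CO2_PER_PASSENGER_KM   # 183 kg by construction
print(f"{co2_kg:.0f} kg ≈ {co2_kg * LB_PER_KG:.0f} lb")
# Prints 183 kg ≈ 403 lb; the page quotes 402 lb, presumably converted
# from an unrounded kilogram figure.
```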

Map of flight path and driving directions from Zyryanka to Nakashibetsu

See the map of the shortest flight path between Zyryanka Airport (ZKP) and Nakashibetsu Airport (SHB).

Airport information

Origin: Zyryanka Airport
City: Zyryanka
Country: Russia
IATA Code: ZKP
ICAO Code: UESU
Coordinates: 65°44′12″N, 150°42′18″E
Destination: Nakashibetsu Airport
City: Nakashibetsu
Country: Japan
IATA Code: SHB
ICAO Code: RJCN
Coordinates: 43°34′38″N, 144°57′36″E
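The distance examples earlier on this page use decimal degrees. A small helper (hypothetical, not part of the calculator) converts the DMS coordinates listed above:

```python
# Convert degrees/minutes/seconds to the decimal degrees used in the
# distance examples; southern and western hemispheres are negative.
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

print(dms_to_decimal(65, 44, 12, "N"))   # ~65.736667 (ZKP latitude)
print(dms_to_decimal(150, 42, 18, "E"))  # 150.705    (ZKP longitude)
print(dms_to_decimal(43, 34, 38, "N"))   # ~43.577222 (SHB latitude)
print(dms_to_decimal(144, 57, 36, "E"))  # 144.96     (SHB longitude)
```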