
How far is Nakashibetsu from Yonaguni Jima?

The distance between Yonaguni Jima (Yonaguni Airport) and Nakashibetsu (Nakashibetsu Airport) is 1813 miles / 2917 kilometers / 1575 nautical miles.

Yonaguni Airport – Nakashibetsu Airport

  • 1813 miles
  • 2917 kilometers
  • 1575 nautical miles


Distance from Yonaguni Jima to Nakashibetsu

There are several ways to calculate the distance from Yonaguni Jima to Nakashibetsu. Here are two standard methods:

Vincenty's formula (applied above)
  • 1812.518 miles
  • 2916.965 kilometers
  • 1575.035 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
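The iteration behind Vincenty's inverse method can be sketched in a few dozen lines. The version below assumes the WGS-84 ellipsoid (the page does not state which ellipsoid it uses) and plugs in the decimal form of the coordinates listed under Airport information:

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Ellipsoidal distance in metres (Vincenty inverse, WGS-84 assumed)."""
    a = 6378137.0               # semi-major axis, metres
    f = 1 / 298.257223563       # flattening
    b = (1 - f) * a             # semi-minor axis

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):        # iterate until the longitude difference converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0          # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m + C * cos_sigma *
                                     (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
        B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) *
        (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# OGN (24°28′0″N, 122°58′40″E) to SHB (43°34′38″N, 144°57′36″E)
km = vincenty_distance(24.466667, 122.977778, 43.577222, 144.960000) / 1000
print(f"{km:.3f} km")
```

For these two airports the result lands at roughly 2917 km, matching the figure above to within rounding.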

Haversine formula
  • 1812.995 miles
  • 2917.732 kilometers
  • 1575.449 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
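The haversine version is much shorter, since a sphere needs no iteration. The sketch below assumes a mean Earth radius of 6371 km (the page does not say which radius it uses, so the last decimal places may differ):

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, r_km=6371.0):
    """Great-circle distance in km, assuming a spherical Earth of radius r_km."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r_km * math.asin(math.sqrt(h))

# OGN to SHB, decimal degrees
print(f"{haversine_distance(24.466667, 122.977778, 43.577222, 144.96):.3f} km")
```

Note that the spherical result (≈2917.7 km) is slightly longer than the ellipsoidal one, which is typical for routes with a large north–south component.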

How long does it take to fly from Yonaguni Jima to Nakashibetsu?

The estimated flight time from Yonaguni Airport to Nakashibetsu Airport is 3 hours and 55 minutes.
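Estimates like this usually come down to distance divided by an assumed average block speed. The ~465 mph figure below is an illustrative assumption that happens to land near the stated time, not the site's published method:

```python
def estimate_flight_time(distance_miles, block_speed_mph=465):
    """Return (hours, minutes) for a flight, given an assumed block speed."""
    hours = distance_miles / block_speed_mph
    h = int(hours)
    m = round((hours - h) * 60)
    return h, m

h, m = estimate_flight_time(1813)
print(f"{h} hours and {m} minutes")  # 3 hours and 54 minutes
```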

What is the time difference between Yonaguni Jima and Nakashibetsu?

There is no time difference between Yonaguni Jima and Nakashibetsu.

Flight carbon footprint between Yonaguni Airport (OGN) and Nakashibetsu Airport (SHB)

On average, flying from Yonaguni Jima to Nakashibetsu generates about 201 kg of CO2 per passenger, which is roughly 443 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
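The kilogram-to-pound conversion in the paragraph above is easy to check with the standard factor of about 2.20462 lb per kg:

```python
KG_TO_LB = 2.20462  # standard conversion factor

co2_kg = 201
co2_lb = co2_kg * KG_TO_LB
print(round(co2_lb))  # 443
```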

Map of flight path from Yonaguni Jima to Nakashibetsu

See the map of the shortest flight path between Yonaguni Airport (OGN) and Nakashibetsu Airport (SHB).

Airport information

Origin Yonaguni Airport
City: Yonaguni Jima
Country: Japan
IATA Code: OGN
ICAO Code: ROYN
Coordinates: 24°28′0″N, 122°58′40″E
Destination Nakashibetsu Airport
City: Nakashibetsu
Country: Japan
IATA Code: SHB
ICAO Code: RJCN
Coordinates: 43°34′38″N, 144°57′36″E
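The coordinates above are given in degrees, minutes, and seconds, while distance formulas like those earlier on the page take decimal degrees. A small conversion helper (written for illustration; the function name is not from the source):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere (N/S/E/W) to decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# Yonaguni Airport: 24°28′0″N, 122°58′40″E
print(dms_to_decimal(24, 28, 0, "N"), dms_to_decimal(122, 58, 40, "E"))
# Nakashibetsu Airport: 43°34′38″N, 144°57′36″E
print(dms_to_decimal(43, 34, 38, "N"), dms_to_decimal(144, 57, 36, "E"))
```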