
How far is Okushiri Island from Yonaguni Jima?

The distance between Yonaguni Jima (Yonaguni Airport) and Okushiri Island (Okushiri Airport) is 1537 miles / 2473 kilometers / 1335 nautical miles.

Yonaguni Airport – Okushiri Airport

  • 1537 miles
  • 2473 kilometers
  • 1335 nautical miles


Distance from Yonaguni Jima to Okushiri Island

There are several ways to calculate the distance from Yonaguni Jima to Okushiri Island. Here are two standard methods:

Vincenty's formula (applied above)
  • 1536.615 miles
  • 2472.942 kilometers
  • 1335.282 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
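A minimal sketch of Vincenty's inverse method on the WGS-84 ellipsoid follows; the exact ellipsoid parameters and rounding this site uses are assumptions, but the standard WGS-84 constants reproduce the figure above to within a few kilometres:

```python
import math

A_AXIS = 6378137.0         # WGS-84 semi-major axis (metres) -- assumed
F = 1 / 298.257223563      # WGS-84 flattening -- assumed
B_AXIS = A_AXIS * (1 - F)  # semi-minor axis

def vincenty_km(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Ellipsoidal distance between two lat/lon points, in kilometres."""
    u1 = math.atan((1 - F) * math.tan(math.radians(lat1)))
    u2 = math.atan((1 - F) * math.tan(math.radians(lat2)))
    big_l = math.radians(lon2 - lon1)
    sin_u1, cos_u1 = math.sin(u1), math.cos(u1)
    sin_u2, cos_u2 = math.sin(u2), math.cos(u2)

    # Iterate on the longitude difference on the auxiliary sphere.
    lam = big_l
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cos_u2 * sin_lam,
                               cos_u1 * sin_u2 - sin_u1 * cos_u2 * cos_lam)
        cos_sigma = sin_u1 * sin_u2 + cos_u1 * cos_u2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cos_u1 * cos_u2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sin_u1 * sin_u2 / cos2_alpha
        c = F / 16 * cos2_alpha * (4 + F * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = big_l + (1 - c) * F * sin_alpha * (
            sigma + c * sin_sigma * (
                cos_2sm + c * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u_sq = cos2_alpha * (A_AXIS ** 2 - B_AXIS ** 2) / B_AXIS ** 2
    big_a = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
    big_b = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
    delta_sigma = big_b * sin_sigma * (cos_2sm + big_b / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - big_b / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
        * (-3 + 4 * cos_2sm ** 2)))
    return B_AXIS * big_a * (sigma - delta_sigma) / 1000.0

# OGN (24°28′0″N, 122°58′40″E) to OIR (42°4′18″N, 139°25′58″E)
print(round(vincenty_km(24.466667, 122.977778, 42.071667, 139.432778), 1))
```

The iteration converges in a handful of steps for routes like this one; the formula only struggles for nearly antipodal point pairs.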

Haversine formula
  • 1537.831 miles
  • 2474.899 kilometers
  • 1336.338 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
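The haversine computation is much shorter, since it needs only a single radius. A sketch, assuming the conventional mean Earth radius of 6371 km (the exact value this site uses is not stated):

```python
import math

EARTH_RADIUS_KM = 6371.0  # mean Earth radius -- assumed value

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in kilometres."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

# OGN (24°28′0″N, 122°58′40″E) to OIR (42°4′18″N, 139°25′58″E)
print(round(haversine_km(24.466667, 122.977778, 42.071667, 139.432778), 1))
```

For a route of this length the spherical result differs from the ellipsoidal Vincenty figure by only about 2 km, i.e. well under 0.1%.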

How long does it take to fly from Yonaguni Jima to Okushiri Island?

The estimated flight time from Yonaguni Airport to Okushiri Airport is 3 hours and 24 minutes.

What is the time difference between Yonaguni Jima and Okushiri Island?

There is no time difference between Yonaguni Jima and Okushiri Island.

Flight carbon footprint between Yonaguni Airport (OGN) and Okushiri Airport (OIR)

On average, flying from Yonaguni Jima to Okushiri Island generates about 182 kg of CO2 per passenger, which is equivalent to roughly 401 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
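The kilograms-to-pounds conversion can be checked in one line (the 182 kg figure itself comes from the estimate above; the emission model behind it is not shown here):

```python
KG_TO_LBS = 2.20462  # pounds per kilogram

co2_kg = 182
co2_lbs = co2_kg * KG_TO_LBS
print(round(co2_lbs))  # 401
```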

Map of flight path from Yonaguni Jima to Okushiri Island

See the map of the shortest flight path between Yonaguni Airport (OGN) and Okushiri Airport (OIR).

Airport information

Origin Yonaguni Airport
City: Yonaguni Jima
Country: Japan
IATA Code: OGN
ICAO Code: ROYN
Coordinates: 24°28′0″N, 122°58′40″E
Destination Okushiri Airport
City: Okushiri Island
Country: Japan
IATA Code: OIR
ICAO Code: RJEO
Coordinates: 42°4′18″N, 139°25′58″E
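The coordinates above are given in degrees/minutes/seconds, while distance formulas expect decimal degrees. A small conversion helper (the function name is illustrative, not part of any particular library):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# Yonaguni Airport (OGN): 24°28′0″N, 122°58′40″E
print(round(dms_to_decimal(24, 28, 0, "N"), 6))    # 24.466667
print(round(dms_to_decimal(122, 58, 40, "E"), 6))  # 122.977778
```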