
How far is Yonago from Shonai?

The distance between Shonai (Shonai Airport) and Yonago (Miho-Yonago Airport) is 428 miles / 689 kilometers / 372 nautical miles.

The driving distance from Shonai (SYO) to Yonago (YGJ) is 567 miles / 912 kilometers, and travel time by car is about 11 hours 3 minutes.

Shonai Airport – Miho-Yonago Airport

428 miles / 689 kilometers / 372 nautical miles


Distance from Shonai to Yonago

There are several ways to calculate the distance from Shonai to Yonago. Here are two standard methods:

Vincenty's formula (applied above)
  • 427.815 miles
  • 688.502 kilometers
  • 371.761 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
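As an illustration, the inverse Vincenty iteration on the WGS-84 ellipsoid can be sketched in pure Python (a minimal sketch of the standard published formula, not this site's own code; the airport coordinates are taken from the table below):

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Inverse Vincenty solution on the WGS-84 ellipsoid; returns metres."""
    a = 6378137.0                  # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563          # WGS-84 flattening
    b = (1 - f) * a                # semi-minor axis (m)

    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)
    # Reduced latitudes on the auxiliary sphere
    U1 = math.atan((1 - f) * math.tan(phi1))
    U2 = math.atan((1 - f) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.sqrt((cosU2 * sin_lam) ** 2 +
                              (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        if sin_sigma == 0:
            return 0.0             # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m +
                C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
        B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) *
        (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# SYO 38°48′43″N 139°47′13″E → YGJ 35°29′31″N 133°14′9″E
km = vincenty_distance(38 + 48/60 + 43/3600, 139 + 47/60 + 13/3600,
                       35 + 29/60 + 31/3600, 133 + 14/60 + 9/3600) / 1000
```

With these coordinates the result comes out at roughly 688.5 km, matching the figure quoted above.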

Haversine formula
  • 427.339 miles
  • 687.736 kilometers
  • 371.348 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
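The haversine calculation is much shorter. A minimal sketch, assuming a mean Earth radius of 6371 km (the exact radius the site uses is not stated):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given radius, in km."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    # Haversine of the central angle between the two points
    h = (math.sin(dphi / 2) ** 2 +
         math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h))

# Same SYO → YGJ coordinates as above, in decimal degrees
km = haversine_km(38 + 48/60 + 43/3600, 139 + 47/60 + 13/3600,
                  35 + 29/60 + 31/3600, 133 + 14/60 + 9/3600)
```

This gives roughly 687.7 km, slightly less than the ellipsoidal Vincenty result, as expected for a spherical approximation.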

How long does it take to fly from Shonai to Yonago?

The estimated flight time from Shonai Airport to Miho-Yonago Airport is 1 hour and 18 minutes.
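Flight-time estimates of this kind are typically a fixed allowance for taxi, climb and descent plus cruise time at an assumed average speed. The constants below (500 mph cruise, 30-minute allowance) are illustrative assumptions, not the site's published formula, which is why the result is close to but not exactly the 1 hour 18 minutes quoted above:

```python
CRUISE_MPH = 500       # assumed average cruise speed (illustrative)
BUFFER_MIN = 30        # assumed taxi/climb/descent allowance (illustrative)

distance_miles = 428
minutes = BUFFER_MIN + distance_miles / CRUISE_MPH * 60  # about 81 minutes
```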

What is the time difference between Shonai and Yonago?

There is no time difference between Shonai and Yonago.

Flight carbon footprint between Shonai Airport (SYO) and Miho-Yonago Airport (YGJ)

On average, flying from Shonai to Yonago generates about 88 kg of CO2 per passenger, equivalent to about 194 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
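The kilogram-to-pound conversion checks out directly:

```python
KG_PER_LB = 0.45359237          # exact definition of the avoirdupois pound
co2_kg = 88
co2_lb = co2_kg / KG_PER_LB     # about 194 lb, as stated above
```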

Map of flight path and driving directions from Shonai to Yonago

See the map of the shortest flight path between Shonai Airport (SYO) and Miho-Yonago Airport (YGJ).

Airport information

Origin Shonai Airport
City: Shonai
Country: Japan
IATA Code: SYO
ICAO Code: RJSY
Coordinates: 38°48′43″N, 139°47′13″E
Destination Miho-Yonago Airport
City: Yonago
Country: Japan
IATA Code: YGJ
ICAO Code: RJOH
Coordinates: 35°29′31″N, 133°14′9″E
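The coordinates above are given in degrees/minutes/seconds; the distance formulas need decimal degrees. A small helper (the function name is ours, for illustration):

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds to signed decimal degrees."""
    dd = deg + minutes / 60 + seconds / 3600
    # Southern and western hemispheres are negative by convention
    return -dd if hemisphere in ("S", "W") else dd

syo_lat = dms_to_decimal(38, 48, 43, "N")   # about 38.8119
syo_lon = dms_to_decimal(139, 47, 13, "E")  # about 139.7869
```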