How far is Yonago from Akita?

The distance between Akita (Akita Airport) and Yonago (Miho-Yonago Airport) is 477 miles / 768 kilometers / 415 nautical miles.

The driving distance from Akita (AXT) to Yonago (YGJ) is 638 miles / 1027 kilometers, and travel time by car is about 12 hours 39 minutes.

Akita Airport – Miho-Yonago Airport

477 miles / 768 kilometers / 415 nautical miles

Distance from Akita to Yonago

There are several ways to calculate the distance from Akita to Yonago. Here are two standard methods:

Vincenty's formula (applied above)
  • 477.097 miles
  • 767.814 kilometers
  • 414.586 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
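The page does not publish its implementation, but Vincenty's inverse formula is well documented. Below is a minimal Python sketch of it on the WGS-84 ellipsoid; the function name vincenty_miles and the decimal coordinates (converted from the DMS values in the airport information section) are our own.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns statute miles."""
    a = 6378137.0                      # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563              # WGS-84 flattening
    b = (1 - f) * a                    # semi-minor axis (m)

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0                 # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                   if cos2_alpha else 0.0)   # equatorial geodesic
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma
                                     * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
        * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - d_sigma) / 1609.344   # meters -> statute miles

# AXT and YGJ in decimal degrees (from the DMS coordinates listed below)
print(vincenty_miles(39.6156, 140.2189, 35.4919, 133.2358))  # ≈ 477.1
```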

Haversine formula
  • 476.688 miles
  • 767.156 kilometers
  • 414.231 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
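The haversine formula is short enough to show in full. A minimal Python sketch follows; the function name haversine_miles and the mean Earth radius of 6,371 km are our choices, and the decimal coordinates are converted from the DMS values in the airport information section.

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given radius, in statute miles."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    km = 2 * radius_km * math.asin(math.sqrt(a))
    return km / 1.609344               # kilometers -> statute miles

# AXT and YGJ in decimal degrees (from the DMS coordinates listed below)
print(haversine_miles(39.6156, 140.2189, 35.4919, 133.2358))  # ≈ 476.7
```

The slightly smaller result compared with Vincenty's (476.7 vs. 477.1 miles) reflects the spherical approximation.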

How long does it take to fly from Akita to Yonago?

The estimated flight time from Akita Airport to Miho-Yonago Airport is 1 hour and 24 minutes.
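The page does not state its flight-time model. A common rule of thumb, which we assume here for illustration, is a fixed allowance of about 30 minutes for taxi, takeoff, climb, and descent, plus cruise at roughly 500 mph:

```python
def estimate_flight_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    # Assumed model: fixed ground/climb overhead plus cruise at a constant speed.
    return overhead_min + distance_miles / cruise_mph * 60

print(estimate_flight_minutes(477))   # ≈ 87 minutes, near the 1 h 24 min above
```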

What is the time difference between Akita and Yonago?

There is no time difference between Akita and Yonago.

Flight carbon footprint between Akita Airport (AXT) and Miho-Yonago Airport (YGJ)

On average, flying from Akita to Yonago generates about 95 kg of CO2 per passenger, which is roughly 210 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
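The unit conversion uses the standard factor of about 2.20462 pounds per kilogram:

```python
kg_per_passenger = 95
lbs = kg_per_passenger * 2.20462       # 1 kg ≈ 2.20462 lb
print(round(lbs))                      # 209.4 -> 209; the page rounds to 210
```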

Map of flight path and driving directions from Akita to Yonago

See the map of the shortest flight path between Akita Airport (AXT) and Miho-Yonago Airport (YGJ).

Airport information

Origin: Akita Airport
City: Akita
Country: Japan
IATA Code: AXT
ICAO Code: RJSK
Coordinates: 39°36′56″N, 140°13′8″E
Destination: Miho-Yonago Airport
City: Yonago
Country: Japan
IATA Code: YGJ
ICAO Code: RJOH
Coordinates: 35°29′31″N, 133°14′9″E
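The coordinates above are given in degrees, minutes, and seconds, while the distance sketches earlier expect decimal degrees. A small conversion helper (the name dms_to_decimal is ours):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# Akita Airport: 39°36′56″N, 140°13′8″E
print(dms_to_decimal(39, 36, 56, "N"))    # ≈ 39.6156
print(dms_to_decimal(140, 13, 8, "E"))    # ≈ 140.2189
```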