
How far is Amakusa from Shonai?

The distance between Shonai (Shonai Airport) and Amakusa (Amakusa Airfield) is 695 miles / 1118 kilometers / 604 nautical miles.

The driving distance from Shonai (SYO) to Amakusa (AXJ) is 909 miles / 1463 kilometers, and travel time by car is about 17 hours 37 minutes.

Shonai Airport – Amakusa Airfield

  • 695 miles
  • 1118 kilometers
  • 604 nautical miles


Distance from Shonai to Amakusa

There are several ways to calculate the distance from Shonai to Amakusa. Here are two standard methods:

Vincenty's formula (applied above)
  • 694.973 miles
  • 1118.451 kilometers
  • 603.915 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
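The iterative inverse method can be sketched in Python. This is a minimal implementation assuming the standard WGS-84 ellipsoid parameters (semi-major axis 6378137 m, flattening 1/298.257223563); the site does not state which ellipsoid it uses, and the guard for the purely equatorial edge case is omitted for brevity.

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2):
    """Ellipsoidal distance in km via Vincenty's inverse formula (WGS-84)."""
    a = 6378137.0                    # semi-major axis, metres
    f = 1 / 298.257223563            # flattening
    b = (1 - f) * a                  # semi-minor axis
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)
    lam = L
    for _ in range(200):             # iterate lambda until convergence
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break
    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
              * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0

# SYO 38°48′43″N 139°47′13″E and AXJ 32°28′56″N 130°9′32″E, in decimal degrees
syo = (38 + 48/60 + 43/3600, 139 + 47/60 + 13/3600)
axj = (32 + 28/60 + 56/3600, 130 + 9/60 + 32/3600)
print(round(vincenty_km(*syo, *axj), 3))  # should be close to 1118.451 km
```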

Haversine formula
  • 694.625 miles
  • 1117.890 kilometers
  • 603.612 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
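The haversine computation is much shorter. A minimal sketch, assuming the conventional mean Earth radius of 6371 km:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# SYO 38°48′43″N 139°47′13″E and AXJ 32°28′56″N 130°9′32″E, in decimal degrees
syo = (38 + 48/60 + 43/3600, 139 + 47/60 + 13/3600)
axj = (32 + 28/60 + 56/3600, 130 + 9/60 + 32/3600)
print(round(haversine_km(*syo, *axj), 1))  # ≈ 1117.9 km
```

The spherical result (1117.9 km) differs from the ellipsoidal Vincenty figure by roughly half a kilometre over this route, which is why the two methods above disagree slightly.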

How long does it take to fly from Shonai to Amakusa?

The estimated flight time from Shonai Airport to Amakusa Airfield is 1 hour and 48 minutes.
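The site does not state its flight-time model. A common rule of thumb (an assumption here, not the site's formula) is an average cruise speed of about 500 mph plus roughly 30 minutes for taxi, climb, and descent, which lands within a few minutes of the quoted figure:

```python
# Rough block-time heuristic (assumed values, not the site's exact model)
CRUISE_MPH = 500       # assumed average ground speed in cruise
OVERHEAD_MIN = 30      # assumed fixed allowance for taxi, climb, descent

def estimate_flight_minutes(distance_miles):
    return distance_miles / CRUISE_MPH * 60 + OVERHEAD_MIN

minutes = estimate_flight_minutes(695)
print(f"{int(minutes // 60)} h {round(minutes % 60)} min")  # → 1 h 53 min
```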

What is the time difference between Shonai and Amakusa?

There is no time difference between Shonai and Amakusa.

Flight carbon footprint between Shonai Airport (SYO) and Amakusa Airfield (AXJ)

On average, flying from Shonai to Amakusa generates about 124 kg of CO2 per passenger, which is about 273 pounds (lb). These figures are estimates and include only the CO2 generated by burning jet fuel.
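The kilogram-to-pound conversion can be checked directly, using the exact definition of the international pound (0.45359237 kg):

```python
KG_PER_LB = 0.45359237   # exact definition of the international pound

co2_kg = 124
co2_lb = co2_kg / KG_PER_LB
print(round(co2_lb))  # → 273
```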

Map of flight path and driving directions from Shonai to Amakusa

See the map of the shortest flight path between Shonai Airport (SYO) and Amakusa Airfield (AXJ).

Airport information

Origin Shonai Airport
City: Shonai
Country: Japan
IATA Code: SYO
ICAO Code: RJSY
Coordinates: 38°48′43″N, 139°47′13″E
Destination Amakusa Airfield
City: Amakusa
Country: Japan
IATA Code: AXJ
ICAO Code: RJDA
Coordinates: 32°28′56″N, 130°9′32″E