
How far is Yamagata from Nagoya?

The distance between Nagoya (Nagoya Airfield) and Yamagata (Yamagata Airport) is 290 miles / 466 kilometers / 252 nautical miles.

The driving distance from Nagoya (NKM) to Yamagata (GAJ) is 401 miles / 645 kilometers, and travel time by car is about 7 hours 49 minutes.

Nagoya Airfield – Yamagata Airport

290 miles / 466 kilometers / 252 nautical miles


Distance from Nagoya to Yamagata

There are several ways to calculate the distance from Nagoya to Yamagata. Here are two standard methods:

Vincenty's formula (applied above)
  • 289.596 miles
  • 466.060 kilometers
  • 251.652 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
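As an illustration, the iterative inverse Vincenty solution on the WGS-84 ellipsoid can be sketched as below. The function name, structure, and convergence tolerance are choices made for this sketch, not the calculator's actual implementation; the coordinates are the NKM and GAJ values from the airport information further down.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Inverse Vincenty solution on the WGS-84 ellipsoid.

    Takes decimal-degree coordinates and returns the geodesic
    distance in kilometers.
    """
    a = 6378137.0              # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a            # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):       # iterate until lambda converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0         # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # cos(2 * sigma_m); equatorial geodesics have cos2_alpha == 0
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma
                * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0  # meters -> km

# NKM 35°15'18"N, 136°55'26"E and GAJ 38°24'42"N, 140°22'15"E
print(round(vincenty_distance(35.2550, 136.9239, 38.4117, 140.3708), 3))
```

Run on these coordinates, the result lands within a few hundred meters of the 466.060 km quoted above.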

Haversine formula
  • 289.630 miles
  • 466.114 kilometers
  • 251.681 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
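A minimal haversine sketch, assuming the commonly used mean Earth radius of 6371 km (the page does not state which radius it uses):

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two decimal-degree points
    on a sphere of the given radius, in kilometers."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# NKM -> GAJ, coordinates from the airport information below
print(round(haversine_distance(35.2550, 136.9239, 38.4117, 140.3708), 1))
```

With these inputs the spherical result agrees with the 466.114 km figure above to within a fraction of a kilometer.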

How long does it take to fly from Nagoya to Yamagata?

The estimated flight time from Nagoya Airfield to Yamagata Airport is 1 hour and 2 minutes.
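One common rule of thumb for block time, adding a fixed taxi/climb/descent overhead to cruise time at a constant speed, can be sketched as follows. The 500 mph cruise speed and 30-minute overhead are assumed defaults for this sketch, not the site's model, so it lands close to, but not exactly on, the quoted 1 hour 2 minutes.

```python
def estimate_flight_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    """Rule-of-thumb block time in minutes: fixed overhead plus
    cruise at a constant speed (assumed parameters)."""
    return overhead_min + distance_miles / cruise_mph * 60

minutes = estimate_flight_minutes(290)
print(f"{int(minutes // 60)} h {round(minutes % 60)} min")
```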

What is the time difference between Nagoya and Yamagata?

There is no time difference between Nagoya and Yamagata.

Flight carbon footprint between Nagoya Airfield (NKM) and Yamagata Airport (GAJ)

On average, flying from Nagoya to Yamagata generates about 68 kg (149 lb) of CO2 per passenger. This figure is an estimate and includes only the CO2 generated by burning jet fuel.
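A linear per-distance emissions estimate reproduces these figures. The 0.146 kg/km factor below is back-derived from the numbers on this page (68 kg over 466 km), not an official emissions factor:

```python
def co2_per_passenger_kg(distance_km, kg_per_km=0.146):
    """Linear CO2 estimate per passenger; the default factor is
    back-derived from this page's figures, not an official value."""
    return distance_km * kg_per_km

kg = co2_per_passenger_kg(466)
lbs = kg * 2.20462  # kilograms -> pounds
print(round(kg), round(lbs))
```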

Map of flight path and driving directions from Nagoya to Yamagata

See the map of the shortest flight path between Nagoya Airfield (NKM) and Yamagata Airport (GAJ).

Airport information

Origin Nagoya Airfield
City: Nagoya
Country: Japan
IATA Code: NKM
ICAO Code: RJNA
Coordinates: 35°15′18″N, 136°55′26″E
Destination Yamagata Airport
City: Yamagata
Country: Japan
IATA Code: GAJ
ICAO Code: RJSC
Coordinates: 38°24′42″N, 140°22′15″E
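The distance formulas above take decimal degrees, while the coordinates here are given in degrees/minutes/seconds. A small conversion sketch (the function name is illustrative):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere letter
    (N/S/E/W) to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# Yamagata Airport (GAJ): 38°24'42"N, 140°22'15"E
print(round(dms_to_decimal(38, 24, 42, "N"), 4))   # latitude
print(round(dms_to_decimal(140, 22, 15, "E"), 4))  # longitude
```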