How far is Yamagata from Niigata?

The distance between Niigata (Niigata Airport) and Yamagata (Yamagata Airport) is 75 miles / 121 kilometers / 65 nautical miles.

The driving distance from Niigata (KIJ) to Yamagata (GAJ) is 107 miles / 173 kilometers, and travel time by car is about 2 hours 12 minutes.

Distance from Niigata to Yamagata

There are several ways to calculate the distance from Niigata to Yamagata. Here are two standard methods:

Vincenty's formula (applied above)
  • 74.966 miles
  • 120.646 kilometers
  • 65.143 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
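The iterative inverse method can be sketched as follows. This is the standard Vincenty formulation on the WGS-84 ellipsoid, not necessarily the exact code behind the numbers above; the decimal coordinates are converted from the airport information section below.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Distance in metres between two points on the WGS-84 ellipsoid."""
    a = 6378137.0          # semi-major axis (m)
    f = 1 / 298.257223563  # flattening
    b = (1 - f) * a        # semi-minor axis

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate lambda until convergence
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.sqrt((cosU2 * sin_lam) ** 2 +
                              (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m +
            C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
        B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) *
        (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

kij = (37.955833, 139.120833)  # 37°57′21″N, 139°7′15″E
gaj = (38.411667, 140.370833)  # 38°24′42″N, 140°22′15″E
print(vincenty_distance(*kij, *gaj) / 1000)  # should be ≈ 120.6 km
```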

Haversine formula
  • 74.841 miles
  • 120.445 kilometers
  • 65.035 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
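As a concrete check, a minimal haversine sketch using the airport coordinates from the information section below reproduces the figure above; the mean Earth radius of 6371 km is the usual spherical-model assumption.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere via the haversine formula."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2 +
         math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

kij = (37.955833, 139.120833)  # 37°57′21″N, 139°7′15″E
gaj = (38.411667, 140.370833)  # 38°24′42″N, 140°22′15″E
print(haversine_km(*kij, *gaj))  # ≈ 120.4 km
```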

How long does it take to fly from Niigata to Yamagata?

The estimated flight time from Niigata Airport to Yamagata Airport is 38 minutes.
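The calculator's own method isn't published, but a common rule of thumb — an assumed 500 mph average cruise speed plus an assumed 30-minute allowance for taxi, climb and descent — lands close to the quoted figure:

```python
# Both constants are assumptions for illustration, not the site's method.
CRUISE_MPH = 500   # assumed average cruise speed
PADDING_MIN = 30   # assumed allowance for taxi, climb and descent

def estimate_flight_minutes(miles: float) -> float:
    return PADDING_MIN + miles / CRUISE_MPH * 60

print(round(estimate_flight_minutes(75)))  # ≈ 39 minutes, near the 38 quoted
```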

What is the time difference between Niigata and Yamagata?

There is no time difference between Niigata and Yamagata.

Flight carbon footprint between Niigata Airport (KIJ) and Yamagata Airport (GAJ)

On average, flying from Niigata to Yamagata generates about 36 kg (79 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
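The kilogram-to-pound conversion above, and the per-mile intensity it implies, can be checked in a couple of lines; the 2.20462 lb/kg factor is standard, while the 36 kg figure comes from this page, not from the sketch.

```python
KG_TO_LB = 2.20462            # pounds per kilogram (standard conversion)

co2_kg = 36                   # per-passenger estimate quoted above
co2_lb = co2_kg * KG_TO_LB    # ≈ 79.4 lb, matching the 79 lb figure

# Implied intensity for this short hop (derived from the page's numbers,
# not a general emission factor):
kg_per_mile = co2_kg / 75     # = 0.48 kg CO2 per passenger-mile
```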

Map of flight path and driving directions from Niigata to Yamagata

See the map of the shortest flight path between Niigata Airport (KIJ) and Yamagata Airport (GAJ).

Airport information

Origin: Niigata Airport
City: Niigata
Country: Japan
IATA Code: KIJ
ICAO Code: RJSN
Coordinates: 37°57′21″N, 139°7′15″E
Destination: Yamagata Airport
City: Yamagata
Country: Japan
IATA Code: GAJ
ICAO Code: RJSC
Coordinates: 38°24′42″N, 140°22′15″E
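The coordinates above are given in degrees, minutes and seconds; a small parser (the pattern below is an assumption matched to this page's notation, including the prime and double-prime marks) converts them to the decimal degrees the distance formulas need:

```python
import re

def dms_to_decimal(dms: str) -> float:
    """Convert a coordinate like 37°57′21″N to decimal degrees."""
    m = re.match(r"(\d+)°(\d+)′(\d+)″([NSEW])", dms)
    deg, minutes, seconds, hemi = m.groups()
    value = int(deg) + int(minutes) / 60 + int(seconds) / 3600
    return -value if hemi in "SW" else value  # south/west are negative

print(dms_to_decimal("37°57′21″N"))  # ≈ 37.9558
```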