
How far is Niigata from Yamagata?

The distance between Yamagata (Yamagata Airport) and Niigata (Niigata Airport) is 75 miles / 121 kilometers / 65 nautical miles.

The driving distance from Yamagata (GAJ) to Niigata (KIJ) is 107 miles / 172 kilometers, and travel time by car is about 2 hours 11 minutes.

Yamagata Airport – Niigata Airport

  • 75 miles
  • 121 kilometers
  • 65 nautical miles


Distance from Yamagata to Niigata

There are several ways to calculate the distance from Yamagata to Niigata. Here are two standard methods:

Vincenty's formula (applied above)
  • 74.966 miles
  • 120.646 kilometers
  • 65.143 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
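As a rough cross-check, an ellipsoidal distance can be computed in Python with the geopy library. Note that geopy's geodesic routine uses the WGS-84 ellipsoid with Karney's algorithm rather than Vincenty's iteration, so this is a comparable sketch rather than the calculator's exact method; the coordinates are the airport coordinates listed further down, converted to decimal degrees.

    from geopy.distance import geodesic

    # Airport coordinates converted to decimal degrees
    # GAJ: 38°24′42″N, 140°22′15″E  ->  (38.411667, 140.370833)
    # KIJ: 37°57′21″N, 139°07′15″E  ->  (37.955833, 139.120833)
    gaj = (38.411667, 140.370833)
    kij = (37.955833, 139.120833)

    d = geodesic(gaj, kij)  # ellipsoidal (WGS-84) distance
    print(f"{d.miles:.3f} mi, {d.km:.3f} km, {d.nautical:.3f} NM")
    # Should land very close to 74.97 mi / 120.65 km / 65.14 NM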

Haversine formula
  • 74.841 miles
  • 120.445 kilometers
  • 65.035 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
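For comparison, here is a minimal Python implementation of the haversine formula itself, using a mean Earth radius of 3,958.8 miles. Fed with the airport coordinates from the table below, it reproduces the spherical figure above to within rounding.

    import math

    def haversine_miles(lat1, lon1, lat2, lon2):
        """Great-circle distance on a sphere with mean Earth radius 3,958.8 mi."""
        r = 3958.8
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        a = (math.sin(dphi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
        return 2 * r * math.asin(math.sqrt(a))

    # GAJ (38°24′42″N, 140°22′15″E) and KIJ (37°57′21″N, 139°07′15″E) in decimal degrees
    print(f"{haversine_miles(38.411667, 140.370833, 37.955833, 139.120833):.3f} miles")
    # Prints roughly 74.84 miles, matching the haversine figure above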

How long does it take to fly from Yamagata to Niigata?

The estimated flight time from Yamagata Airport to Niigata Airport is 38 minutes.
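The page does not state how this estimate is derived. A common rule of thumb, assumed here purely for illustration, is a fixed allowance of about 30 minutes for taxi, climb and descent plus cruise at roughly 500 mph, which lands in the same ballpark as the 38-minute figure.

    def estimated_flight_minutes(distance_miles, cruise_mph=500, overhead_min=30):
        """Rule-of-thumb flight time: fixed overhead plus cruise at an assumed speed."""
        return overhead_min + distance_miles / cruise_mph * 60

    print(round(estimated_flight_minutes(75)))  # -> 39, close to the 38 minutes quoted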

What is the time difference between Yamagata and Niigata?

There is no time difference between Yamagata and Niigata.

Flight carbon footprint between Yamagata Airport (GAJ) and Niigata Airport (KIJ)

On average, flying from Yamagata to Niigata generates about 36 kg of CO2 per passenger, which is equal to about 79 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
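The pound figure follows directly from the kilogram estimate; a quick conversion check (using the site's 36 kg estimate as input):

    KG_TO_LB = 2.20462  # pounds per kilogram

    co2_kg = 36  # per-passenger CO2 estimate quoted above
    print(f"{co2_kg * KG_TO_LB:.0f} lbs")  # -> 79 lbs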

Map of flight path and driving directions from Yamagata to Niigata

See the map of the shortest flight path between Yamagata Airport (GAJ) and Niigata Airport (KIJ).

Airport information

Origin: Yamagata Airport
City: Yamagata
Country: Japan
IATA Code: GAJ
ICAO Code: RJSC
Coordinates: 38°24′42″N, 140°22′15″E
Destination: Niigata Airport
City: Niigata
Country: Japan
IATA Code: KIJ
ICAO Code: RJSN
Coordinates: 37°57′21″N, 139°7′15″E
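The coordinates above are given in degrees, minutes and seconds, while the distance examples earlier use decimal degrees. A small conversion helper (a sketch, not part of the site's tooling):

    def dms_to_decimal(degrees, minutes, seconds, hemisphere):
        """Convert degrees/minutes/seconds plus a hemisphere letter to signed decimal degrees."""
        sign = -1 if hemisphere in ("S", "W") else 1
        return sign * (degrees + minutes / 60 + seconds / 3600)

    # GAJ: 38°24′42″N, 140°22′15″E
    print(dms_to_decimal(38, 24, 42, "N"), dms_to_decimal(140, 22, 15, "E"))
    # KIJ: 37°57′21″N, 139°07′15″E
    print(dms_to_decimal(37, 57, 21, "N"), dms_to_decimal(139, 7, 15, "E"))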