
How far is Jixi from Yanji?

The distance between Yanji (Yanji Chaoyangchuan International Airport) and Jixi (Jixi Xingkaihu Airport) is 188 miles / 302 kilometers / 163 nautical miles.

The driving distance from Yanji (YNJ) to Jixi (JXA) is 304 miles / 490 kilometers, and travel time by car is about 5 hours 35 minutes.


Distance from Yanji to Jixi

There are several ways to calculate the distance from Yanji to Jixi. Here are two standard methods:

Vincenty's formula (applied above)
  • 187.622 miles
  • 301.948 kilometers
  • 163.039 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
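
For reference, here is a minimal Python sketch of an ellipsoidal distance calculation using the geopy library. Note that geopy 2.x implements Karney's geodesic algorithm rather than classical Vincenty, but on the WGS-84 ellipsoid the two agree to well under a metre over a route of this length. The decimal coordinates are converted from the airport table at the end of this page.

    # Ellipsoidal distance sketch (geopy's geodesic on the WGS-84 ellipsoid).
    from geopy.distance import geodesic

    ynj = (42.8828, 129.4508)   # Yanji Chaoyangchuan (YNJ), decimal degrees
    jxa = (45.2928, 131.1928)   # Jixi Xingkaihu (JXA), decimal degrees

    d = geodesic(ynj, jxa)
    print(f"{d.miles:.3f} mi / {d.km:.3f} km / {d.nautical:.3f} nm")
    # Expected to be close to the figures above: about 187.6 mi / 301.9 km / 163.0 nm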

Haversine formula
  • 187.620 miles
  • 301.945 kilometers
  • 163.037 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
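
The haversine figures can be reproduced with a few lines of Python; the third decimal place depends slightly on the earth radius used (the common mean radius of 6371 km is assumed here).

    import math

    def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
        # Great-circle distance on a sphere of the given mean earth radius.
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
        return 2 * radius_km * math.asin(math.sqrt(a))

    km = haversine_km(42.8828, 129.4508, 45.2928, 131.1928)
    print(f"{km:.1f} km / {km / 1.609344:.1f} mi / {km / 1.852:.1f} nm")   # ~301.9 / 187.6 / 163.0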

How long does it take to fly from Yanji to Jixi?

The estimated flight time from Yanji Chaoyangchuan International Airport to Jixi Xingkaihu Airport is 51 minutes.
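
The calculator does not publish its flight-time formula. A common rule of thumb for estimates like this is a fixed allowance for taxi, climb and descent plus cruise time at a typical airliner speed; the sketch below uses assumed values (30 minutes overhead, 500 mph cruise) purely for illustration and lands in the same ballpark as the quoted 51 minutes.

    # Illustrative block-time estimate; the assumed constants are not the site's own.
    distance_miles = 187.6      # Vincenty distance from the section above
    cruise_mph = 500            # assumed average cruise speed
    overhead_min = 30           # assumed fixed time for taxi, climb and descent

    estimate_min = overhead_min + distance_miles / cruise_mph * 60
    print(round(estimate_min))  # about 53 minutes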

What is the time difference between Yanji and Jixi?

There is no time difference between Yanji and Jixi.

Flight carbon footprint between Yanji Chaoyangchuan International Airport (YNJ) and Jixi Xingkaihu Airport (JXA)

On average, flying from Yanji to Jixi generates about 53 kg of CO2 per passenger (roughly 116 pounds). These figures are estimates and include only the CO2 generated by burning jet fuel.

Map of flight path and driving directions from Yanji to Jixi

See the map of the shortest flight path between Yanji Chaoyangchuan International Airport (YNJ) and Jixi Xingkaihu Airport (JXA).

Airport information

Origin: Yanji Chaoyangchuan International Airport
City: Yanji
Country: China
IATA Code: YNJ
ICAO Code: ZYYJ
Coordinates: 42°52′58″N, 129°27′3″E
Destination: Jixi Xingkaihu Airport
City: Jixi
Country: China
IATA Code: JXA
ICAO Code: ZYJX
Coordinates: 45°17′34″N, 131°11′34″E
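
The coordinates above are listed in degrees, minutes and seconds; the decimal values used in the code sketches earlier can be derived with a small helper like this (a sketch, not part of the calculator):

    def dms_to_decimal(degrees, minutes, seconds, hemisphere):
        # Convert degrees/minutes/seconds plus a hemisphere letter to decimal degrees.
        value = degrees + minutes / 60 + seconds / 3600
        return -value if hemisphere in ("S", "W") else value

    print(dms_to_decimal(42, 52, 58, "N"), dms_to_decimal(129, 27, 3, "E"))    # YNJ ≈ 42.8828, 129.4508
    print(dms_to_decimal(45, 17, 34, "N"), dms_to_decimal(131, 11, 34, "E"))   # JXA ≈ 45.2928, 131.1928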