
How far is Dayong from Jixi?

The distance between Jixi (Jixi Xingkaihu Airport) and Dayong (Zhangjiajie Hehua International Airport) is 1590 miles / 2559 kilometers / 1382 nautical miles.

The driving distance from Jixi (JXA) to Dayong (DYG) is 1899 miles / 3056 kilometers, and travel time by car is about 34 hours 20 minutes.

Jixi Xingkaihu Airport – Zhangjiajie Hehua International Airport

1590 miles / 2559 kilometers / 1382 nautical miles


Distance from Jixi to Dayong

There are several ways to calculate the distance from Jixi to Dayong. Here are two standard methods:

Vincenty's formula (applied above)
  • 1589.868 miles
  • 2558.644 kilometers
  • 1381.557 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
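
For reference, here is a minimal Python sketch of Vincenty's inverse formula, assuming the standard WGS-84 ellipsoid parameters (the calculator's exact implementation is not published). It does not handle nearly antipodal points, where the iteration can fail to converge.

    import math

    def vincenty_inverse(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
        """Distance in metres between two points on the WGS-84 ellipsoid,
        Vincenty's inverse formula. Coordinates in decimal degrees."""
        a = 6378137.0             # WGS-84 semi-major axis (m)
        f = 1 / 298.257223563     # WGS-84 flattening
        b = (1 - f) * a           # semi-minor axis

        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        L = math.radians(lon2 - lon1)
        U1 = math.atan((1 - f) * math.tan(phi1))
        U2 = math.atan((1 - f) * math.tan(phi2))
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)

        lam = L
        for _ in range(max_iter):
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.hypot(cosU2 * sin_lam,
                                   cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
            if sin_sigma == 0:
                return 0.0        # coincident points
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos_sq_alpha = 1 - sin_alpha ** 2
            cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos_sq_alpha
                            if cos_sq_alpha else 0.0)
            C = f / 16 * cos_sq_alpha * (4 + f * (4 - 3 * cos_sq_alpha))
            lam_prev = lam
            lam = L + (1 - C) * f * sin_alpha * (
                sigma + C * sin_sigma * (
                    cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
            if abs(lam - lam_prev) < tol:
                break

        u_sq = cos_sq_alpha * (a ** 2 - b ** 2) / b ** 2
        A = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
        B = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
        delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sigma_m ** 2)))
        return b * A * (sigma - delta_sigma)

    # JXA: 45°17′34″N, 131°11′34″E    DYG: 29°6′10″N, 110°26′34″E
    jxa = (45 + 17/60 + 34/3600, 131 + 11/60 + 34/3600)
    dyg = (29 + 6/60 + 10/3600, 110 + 26/60 + 34/3600)
    print(f"{vincenty_inverse(*jxa, *dyg) / 1000:.3f} km")  # close to the ~2,558.6 km above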

Haversine formula
  • 1589.489 miles
  • 2558.035 kilometers
  • 1381.228 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
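
A short Python sketch of the haversine formula, assuming a mean Earth radius of 6,371 km; the coordinates are decimal-degree equivalents of the airport positions listed below.

    import math

    def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
        """Great-circle distance on a sphere of the given mean radius."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlmb = math.radians(lon2 - lon1)
        h = (math.sin(dphi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
        return 2 * radius_km * math.asin(math.sqrt(h))

    # JXA (45.2928°N, 131.1928°E) to DYG (29.1028°N, 110.4428°E)
    print(haversine_km(45.2928, 131.1928, 29.1028, 110.4428))  # roughly 2,558 km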

How long does it take to fly from Jixi to Dayong?

The estimated flight time from Jixi Xingkaihu Airport to Zhangjiajie Hehua International Airport is 3 hours and 30 minutes.
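
The page does not state how this estimate is derived. A common rule of thumb divides the great-circle distance by a typical cruise speed and adds a fixed allowance for taxi, climb and descent; the cruise speed and allowance below are illustrative assumptions, not the calculator's actual parameters, so the result only lands in the same range as the figure above.

    def estimated_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
        """Rough block time: cruise segment plus a fixed taxi/climb/descent allowance."""
        minutes = distance_miles / cruise_mph * 60 + overhead_min
        hours, mins = divmod(round(minutes), 60)
        return f"{hours} h {mins:02d} min"

    print(estimated_flight_time(1590))  # rough estimate, same ballpark as 3 h 30 min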

What is the time difference between Jixi and Dayong?

There is no time difference between Jixi and Dayong.

Flight carbon footprint between Jixi Xingkaihu Airport (JXA) and Zhangjiajie Hehua International Airport (DYG)

On average, flying from Jixi to Dayong generates about 185 kg of CO2 per passenger; 185 kilograms is equivalent to 408 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
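
As a rough check on these figures, dividing the quoted 185 kg by the route distance gives the per-passenger emission factor they imply, and the pound value follows from the standard kg-to-lb conversion.

    co2_kg = 185
    distance_km = 2559
    print(f"{co2_kg / distance_km:.3f} kg CO2 per passenger-km")  # ≈ 0.072
    print(f"{co2_kg * 2.20462:.0f} lb")                           # ≈ 408 lb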

Map of flight path and driving directions from Jixi to Dayong

See the map of the shortest flight path between Jixi Xingkaihu Airport (JXA) and Zhangjiajie Hehua International Airport (DYG).

Airport information

Origin Jixi Xingkaihu Airport
City: Jixi
Country: China
IATA Code: JXA
ICAO Code: ZYJX
Coordinates: 45°17′34″N, 131°11′34″E
Destination Zhangjiajie Hehua International Airport
City: Dayong
Country: China
IATA Code: DYG
ICAO Code: ZGDY
Coordinates: 29°6′10″N, 110°26′34″E