
How far is Dayong from Hotan?

The distance between Hotan (Hotan Airport) and Dayong (Zhangjiajie Hehua International Airport) is 1847 miles / 2973 kilometers / 1605 nautical miles.

The driving distance from Hotan (HTN) to Dayong (DYG) is 2466 miles / 3968 kilometers, and travel time by car is about 45 hours 54 minutes.

Hotan Airport – Zhangjiajie Hehua International Airport

Distance: 1847 miles / 2973 kilometers / 1605 nautical miles


Distance from Hotan to Dayong

There are several ways to calculate the distance from Hotan to Dayong. Here are two standard methods:

Vincenty's formula (applied above)
  • 1847.385 miles
  • 2973.078 kilometers
  • 1605.334 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
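The iterative inverse form of Vincenty's formula can be sketched in Python as below. The WGS-84 ellipsoid constants are standard; the coordinates are the airport coordinates listed at the bottom of this page, converted from degrees/minutes/seconds to decimal degrees.

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Vincenty's inverse formula on the WGS-84 ellipsoid; returns kilometers."""
    a = 6378137.0              # semi-major axis (m)
    f = 1 / 298.257223563      # flattening
    b = (1 - f) * a            # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0         # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - d_sigma) / 1000.0

# HTN (37°2′18″N, 79°51′53″E) to DYG (29°6′10″N, 110°26′34″E)
print(round(vincenty_km(37.038333, 79.864722, 29.102778, 110.442778), 1))
```

The result agrees with the ~2973 km figure quoted above.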

Haversine formula
  • 1844.248 miles
  • 2968.030 kilometers
  • 1602.608 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
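The haversine calculation is much shorter. A minimal sketch, using the IUGG mean Earth radius of 6371.0088 km (the exact radius this site uses is not stated):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance on a spherical Earth; returns kilometers."""
    R = 6371.0088  # mean Earth radius, km (assumed)
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(h))

# HTN to DYG with the airport coordinates listed below
print(round(haversine_km(37.038333, 79.864722, 29.102778, 110.442778), 1))
```

This lands within a few kilometers of the ~2968 km figure above; the ~5 km gap to the Vincenty result reflects the spherical-versus-ellipsoidal Earth models.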

How long does it take to fly from Hotan to Dayong?

The estimated flight time from Hotan Airport to Zhangjiajie Hehua International Airport is 3 hours and 59 minutes.
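A common way to approximate block time is cruise distance over an assumed average speed plus a fixed allowance for taxi, climb, and descent. The 500 mph cruise speed and 30-minute buffer below are illustrative assumptions, not this site's actual formula:

```python
def flight_time(distance_miles, cruise_mph=500, buffer_min=30):
    """Rough block time: cruise leg plus a fixed taxi/climb/descent buffer.
    Cruise speed and buffer are assumed values, not the site's parameters."""
    total_min = distance_miles / cruise_mph * 60 + buffer_min
    return divmod(round(total_min), 60)  # (hours, minutes)

print(flight_time(1847))  # → (4, 12), close to the 3 h 59 min quoted above
```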

Flight carbon footprint between Hotan Airport (HTN) and Zhangjiajie Hehua International Airport (DYG)

On average, flying from Hotan to Dayong generates about 204 kg (449 pounds) of CO2 per passenger. This figure is an estimate and includes only the CO2 generated by burning jet fuel.
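The kilogram-to-pound conversion behind those numbers is a one-liner (the pound is defined as exactly 0.45359237 kg):

```python
CO2_KG = 204                  # per-passenger estimate for this route
KG_PER_LB = 0.45359237        # exact definition of the pound
lbs = CO2_KG / KG_PER_LB
print(round(lbs, 1))          # ~449.7 lbs; the page truncates to 449
```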

Map of flight path and driving directions from Hotan to Dayong

See the map of the shortest flight path between Hotan Airport (HTN) and Zhangjiajie Hehua International Airport (DYG).

Airport information

Origin Hotan Airport
City: Hotan
Country: China
IATA Code: HTN
ICAO Code: ZWTN
Coordinates: 37°2′18″N, 79°51′53″E
Destination Zhangjiajie Hehua International Airport
City: Dayong
Country: China
IATA Code: DYG
ICAO Code: ZGDY
Coordinates: 29°6′10″N, 110°26′34″E