
Distance between Beijing (PEK) and Huaihua (HJJ)

Flight distance from Beijing to Huaihua (Beijing Capital International Airport – Huaihua Zhijiang Airport) is 956 miles / 1539 kilometers / 831 nautical miles. Estimated flight time is 2 hours 18 minutes.

Driving distance from Beijing (PEK) to Huaihua (HJJ) is 1104 miles / 1777 kilometers and travel time by car is about 20 hours 20 minutes.

Beijing – Huaihua

956 miles / 1539 kilometers / 831 nautical miles
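The three figures above are the same great-circle distance expressed in different units. A minimal conversion sketch (using the exact definitions 1 mile = 1.609344 km and 1 nautical mile = 1.852 km):

```python
# Convert a flight distance between unit systems.
# Exact definitions: 1 mile = 1.609344 km, 1 nautical mile = 1.852 km.
MILE_KM = 1.609344
NMI_KM = 1.852

def miles_to_km(miles: float) -> float:
    return miles * MILE_KM

def km_to_nautical_miles(km: float) -> float:
    return km / NMI_KM

distance_miles = 956.223  # Vincenty distance for PEK-HJJ
distance_km = miles_to_km(distance_miles)
distance_nmi = km_to_nautical_miles(distance_km)

print(f"{distance_km:.3f} km")    # ~1538.892 km
print(f"{distance_nmi:.3f} nmi")  # ~830.935 nmi
```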

How far is Huaihua from Beijing?

There are several ways to calculate the distance between Beijing and Huaihua. Here are two common methods:

Vincenty's formula (applied above)
  • 956.223 miles
  • 1538.892 kilometers
  • 830.935 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth’s surface, using an ellipsoidal model of the earth.
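A sketch of Vincenty's inverse formula on the WGS-84 ellipsoid, assuming the standard iterative form and the airport coordinates listed at the bottom of this page (converted to decimal degrees):

```python
import math

# WGS-84 ellipsoid constants
A = 6378137.0          # semi-major axis (m)
F = 1 / 298.257223563  # flattening
B = (1 - F) * A        # semi-minor axis (m)

def vincenty_km(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Ellipsoidal distance in km between two lat/lon points (degrees)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - F) * math.tan(phi1))  # reduced latitudes
    U2 = math.atan((1 - F) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L  # iterate on the longitude difference on the auxiliary sphere
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = F / 16 * cos2_alpha * (4 + F * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * F * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (A ** 2 - B ** 2) / B ** 2
    big_a = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    big_b = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = big_b * sin_sigma * (
        cos_2sigma_m + big_b / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - big_b / 6 * cos_2sigma_m
            * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sigma_m ** 2)))
    return B * big_a * (sigma - delta_sigma) / 1000

# PEK 40°4'48"N 116°35'5"E -> (40.08, 116.5847); HJJ 27°26'27"N 109°42'0"E -> (27.4408, 109.7)
print(round(vincenty_km(40.08, 116.5847, 27.4408, 109.7), 1))  # ~1538.9 km
```

Note that this iteration can fail to converge for nearly antipodal points; production code (e.g. GeographicLib) handles those cases separately.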

Haversine formula
  • 957.845 miles
  • 1541.501 kilometers
  • 832.344 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
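A minimal haversine sketch, assuming a sphere of mean radius 6371 km and the same decimal-degree coordinates as above:

```python
import math

EARTH_RADIUS_KM = 6371.0  # mean Earth radius (spherical model)

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two lat/lon points (degrees)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

# PEK (40.08 N, 116.5847 E) to HJJ (27.4408 N, 109.7 E)
print(round(haversine_km(40.08, 116.5847, 27.4408, 109.7), 1))  # ~1541.5 km
```

The spherical result differs from the ellipsoidal (Vincenty) figure by only a few kilometers, which is typical for routes of this length.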

Flight Duration

Estimated flight time from Beijing Capital International Airport to Huaihua Zhijiang Airport is 2 hours 18 minutes.

Time difference

There is no time difference between Beijing and Huaihua.

Carbon dioxide emissions

On average, flying from Beijing to Huaihua generates about 148 kg of CO2 per passenger; 148 kilograms is equal to 326 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
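The kilogram-to-pound conversion can be checked directly (using the rounded factor 1 kg = 2.20462 lb):

```python
# Convert the per-passenger CO2 estimate from kilograms to pounds.
KG_TO_LB = 2.20462  # rounded conversion factor

co2_kg = 148          # per-passenger estimate from the text
co2_lb = co2_kg * KG_TO_LB
print(round(co2_lb))  # ~326 lbs
```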

Map of flight path and driving directions from Beijing to Huaihua

Shortest flight path between Beijing Capital International Airport (PEK) and Huaihua Zhijiang Airport (HJJ).

Airport information

Origin Beijing Capital International Airport
City: Beijing
Country: China
IATA Code: PEK
ICAO Code: ZBAA
Coordinates: 40°4′48″N, 116°35′5″E
Destination Huaihua Zhijiang Airport
City: Huaihua
Country: China
IATA Code: HJJ
ICAO Code: ZGCJ
Coordinates: 27°26′27″N, 109°42′0″E