
How far is Huaihua from Manzhouli?

The distance between Manzhouli (Manzhouli Xijiao Airport) and Huaihua (Huaihua Zhijiang Airport) is 1579 miles / 2541 kilometers / 1372 nautical miles.

The driving distance from Manzhouli (NZH) to Huaihua (HJJ) is 2077 miles / 3343 kilometers, and travel time by car is about 39 hours 16 minutes.

Manzhouli Xijiao Airport – Huaihua Zhijiang Airport

1579 miles / 2541 kilometers / 1372 nautical miles


Distance from Manzhouli to Huaihua

There are several ways to calculate the distance from Manzhouli to Huaihua. Here are two standard methods:

Vincenty's formula (applied above)
  • 1579.213 miles
  • 2541.498 kilometers
  • 1372.299 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
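The Vincenty inverse method iterates on the difference in longitude between the two points until it converges on the geodesic over the WGS-84 ellipsoid. The sketch below is a standard, self-contained implementation of that iteration (the airport coordinates are taken from the airport information section; the function name is our own):

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns kilometers."""
    a = 6378137.0               # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563       # WGS-84 flattening
    b = (1 - f) * a             # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0          # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)   # equatorial geodesic
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0

# NZH (49°34'0"N, 117°19'48"E) to HJJ (27°26'27"N, 109°42'0"E)
print(round(vincenty_km(49.566667, 117.33, 27.440833, 109.7), 1))
```

Run on the two airports' coordinates, this lands within a kilometer or so of the 2541.498 km figure quoted above.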

Haversine formula
  • 1581.434 miles
  • 2545.071 kilometers
  • 1374.229 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
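The haversine calculation is much shorter, since it needs no iteration. A minimal sketch, assuming a mean Earth radius of 6371 km (the function name is our own):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, r=6371.0):
    """Great-circle distance on a sphere of mean radius r (km)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = phi2 - phi1
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# NZH (49°34'0"N, 117°19'48"E) to HJJ (27°26'27"N, 109°42'0"E)
km = haversine_km(49.566667, 117.33, 27.440833, 109.7)
print(round(km, 1), round(km / 1.609344, 1))  # kilometers, miles
```

This reproduces the ~2545 km / ~1581 mile haversine figures above to within rounding; the small gap versus the Vincenty result reflects the spherical versus ellipsoidal Earth models.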

How long does it take to fly from Manzhouli to Huaihua?

The estimated flight time from Manzhouli Xijiao Airport to Huaihua Zhijiang Airport is 3 hours and 29 minutes.
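The exact model behind that estimate isn't published here, but a common rule of thumb is to divide the flight distance by a typical cruise speed and add a fixed allowance for taxi, climb, and descent. A sketch under those assumptions (500 mph cruise and a 30-minute allowance, both our own illustrative values):

```python
def flight_time_estimate(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough block-time estimate: fixed overhead plus time at cruise speed."""
    total_min = overhead_min + distance_miles / cruise_mph * 60
    h, m = divmod(round(total_min), 60)
    return f"{h} h {m} min"

print(flight_time_estimate(1579))  # 3 h 39 min
```

With these assumed parameters the estimate comes out in the same ballpark as the 3 hours 29 minutes quoted above; the site's own figure presumably uses slightly different speed or overhead values.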

What is the time difference between Manzhouli and Huaihua?

There is no time difference between Manzhouli and Huaihua; both cities observe China Standard Time (UTC+8).

Flight carbon footprint between Manzhouli Xijiao Airport (NZH) and Huaihua Zhijiang Airport (HJJ)

On average, flying from Manzhouli to Huaihua generates about 185 kg of CO2 per passenger, which is roughly 407 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
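The kilograms-to-pounds conversion uses the exact definition of the pound (1 lb = 0.45359237 kg):

```python
KG_PER_LB = 0.45359237          # 1 lb = 0.45359237 kg, by definition

co2_kg = 185
co2_lbs = co2_kg / KG_PER_LB
print(f"{co2_lbs:.1f} lbs")     # 407.9 lbs; truncating gives the 407 quoted
```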

Map of flight path and driving directions from Manzhouli to Huaihua

See the map of the shortest flight path between Manzhouli Xijiao Airport (NZH) and Huaihua Zhijiang Airport (HJJ).

Airport information

Origin Manzhouli Xijiao Airport
City: Manzhouli
Country: China
IATA Code: NZH
ICAO Code: ZBMZ
Coordinates: 49°34′0″N, 117°19′48″E
Destination Huaihua Zhijiang Airport
City: Huaihua
Country: China
IATA Code: HJJ
ICAO Code: ZGCJ
Coordinates: 27°26′27″N, 109°42′0″E
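The coordinates above are given in degrees, minutes, and seconds; the distance formulas earlier on the page need signed decimal degrees. A small helper for the conversion (the function name is our own):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter (N/S/E/W)
    to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# NZH: 49°34′0″N, 117°19′48″E
print(round(dms_to_decimal(49, 34, 0, "N"), 6),
      round(dms_to_decimal(117, 19, 48, "E"), 6))
```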