
How far is Manzhouli from Huaihua?

The distance between Huaihua (Huaihua Zhijiang Airport) and Manzhouli (Manzhouli Xijiao Airport) is 1579 miles / 2541 kilometers / 1372 nautical miles.

The driving distance from Huaihua (HJJ) to Manzhouli (NZH) is 2116 miles / 3405 kilometers, and travel time by car is about 40 hours 28 minutes.

Huaihua Zhijiang Airport – Manzhouli Xijiao Airport

1579 miles / 2541 kilometers / 1372 nautical miles


Distance from Huaihua to Manzhouli

There are several ways to calculate the distance from Huaihua to Manzhouli. Here are two standard methods:

Vincenty's formula (applied above)
  • 1579.213 miles
  • 2541.498 kilometers
  • 1372.299 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
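The exact code behind the figure above isn't shown; as an illustration, here is a standard Vincenty inverse solution on the WGS-84 ellipsoid, using the decimal-degree equivalents of the airport coordinates listed in the airport information section:

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Iterative Vincenty inverse solution on the WGS-84 ellipsoid.
    Returns the geodesic distance in kilometers."""
    a = 6378137.0          # WGS-84 semi-major axis (meters)
    f = 1 / 298.257223563  # WGS-84 flattening
    b = a * (1 - f)        # semi-minor axis

    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(phi1))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate until the longitude difference converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0  # meters -> kilometers

# HJJ (27°26′27″N, 109°42′0″E) to NZH (49°34′0″N, 117°19′48″E)
print(round(vincenty_distance(27.440833, 109.7, 49.566667, 117.33), 1))
```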

Haversine formula
  • 1581.434 miles
  • 2545.071 kilometers
  • 1374.229 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
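The spherical figure is simple to reproduce; a minimal sketch, assuming a mean Earth radius of 6371 km and the same airport coordinates in decimal degrees:

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given radius, in kilometers."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# HJJ to NZH
print(round(haversine_distance(27.440833, 109.7, 49.566667, 117.33), 1))
```

The result lands within a few kilometers of the Vincenty figure; the gap between the two methods reflects the sphere-versus-ellipsoid assumption, not a bug.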

How long does it take to fly from Huaihua to Manzhouli?

The estimated flight time from Huaihua Zhijiang Airport to Manzhouli Xijiao Airport is 3 hours and 29 minutes.
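The page does not state how this estimate is derived. A common rule of thumb (an assumption here, not this site's exact model) adds a fixed overhead for taxi, climb, and descent to cruise time at roughly 500 mph:

```python
def estimated_flight_time(distance_miles, cruise_mph=500, overhead_hours=0.5):
    """Rough block-time estimate: cruise time plus a fixed
    taxi/climb/descent overhead. Parameters are assumptions."""
    hours = distance_miles / cruise_mph + overhead_hours
    h = int(hours)
    m = round((hours - h) * 60)
    return h, m

# With these assumed parameters the 1579-mile route gives 3 h 39 min;
# the site's 3 h 29 min implies slightly different speed/overhead values.
print(estimated_flight_time(1579))
```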

What is the time difference between Huaihua and Manzhouli?

There is no time difference between Huaihua and Manzhouli.

Flight carbon footprint between Huaihua Zhijiang Airport (HJJ) and Manzhouli Xijiao Airport (NZH)

On average, flying from Huaihua to Manzhouli generates about 185 kg (roughly 408 lbs) of CO2 per passenger. The figures are estimates and include only the CO2 generated by burning jet fuel.
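The page's figure works out to roughly 0.117 kg of CO2 per passenger-mile on this route. A sketch of that conversion (the per-mile factor is back-derived from this page's numbers, not an official emission factor):

```python
KG_PER_LB = 0.45359237  # exact definition of the international pound

def co2_estimate_kg(distance_miles, kg_per_mile=185 / 1579):
    """Scale this route's per-passenger CO2 figure linearly with distance.
    The default factor is derived from this page's 185 kg over 1579 miles."""
    return distance_miles * kg_per_mile

kg = co2_estimate_kg(1579)
print(round(kg), "kg =", round(kg / KG_PER_LB), "lbs")
```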

Map of flight path and driving directions from Huaihua to Manzhouli

See the map of the shortest flight path between Huaihua Zhijiang Airport (HJJ) and Manzhouli Xijiao Airport (NZH).

Airport information

Origin: Huaihua Zhijiang Airport
City: Huaihua
Country: China
IATA Code: HJJ
ICAO Code: ZGCJ
Coordinates: 27°26′27″N, 109°42′0″E

Destination: Manzhouli Xijiao Airport
City: Manzhouli
Country: China
IATA Code: NZH
ICAO Code: ZBMZ
Coordinates: 49°34′0″N, 117°19′48″E
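The coordinates above are given in degrees, minutes, and seconds; the distance formulas use decimal degrees. A small helper for the conversion:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter
    (N/S/E/W) to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# Huaihua Zhijiang Airport: 27°26′27″N, 109°42′0″E
print(dms_to_decimal(27, 26, 27, "N"), dms_to_decimal(109, 42, 0, "E"))
```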