
How far is Manzhouli from Zyryanka?

The distance between Zyryanka (Zyryanka Airport) and Manzhouli (Manzhouli Xijiao Airport) is 1634 miles / 2629 kilometers / 1420 nautical miles.

The driving distance from Zyryanka (ZKP) to Manzhouli (NZH) is 2313 miles / 3722 kilometers, and travel time by car is about 69 hours 48 minutes.

Zyryanka Airport – Manzhouli Xijiao Airport

1634 miles / 2629 kilometers / 1420 nautical miles


Distance from Zyryanka to Manzhouli

There are several ways to calculate the distance from Zyryanka to Manzhouli. Here are two standard methods:

Vincenty's formula (applied above)
  • 1633.561 miles
  • 2628.962 kilometers
  • 1419.526 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
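For illustration, the iterative Vincenty inverse method can be sketched in Python on the WGS-84 ellipsoid. This is a standard textbook formulation, not necessarily the exact implementation this calculator uses; the airport coordinates are taken from the table at the bottom of the page, converted to decimal degrees.

```python
from math import radians, sin, cos, tan, atan, atan2, sqrt

def vincenty_km(lat1, lon1, lat2, lon2):
    """Geodesic distance (km) via Vincenty's inverse formula on WGS-84."""
    a, f = 6378137.0, 1 / 298.257223563   # semi-major axis (m), flattening
    b = (1 - f) * a
    L = radians(lon2 - lon1)
    U1 = atan((1 - f) * tan(radians(lat1)))   # reduced latitudes
    U2 = atan((1 - f) * tan(radians(lat2)))
    sinU1, cosU1, sinU2, cosU2 = sin(U1), cos(U1), sin(U2), cos(U2)
    lam = L
    for _ in range(200):                      # iterate until lambda converges
        sinlam, coslam = sin(lam), cos(lam)
        sin_sigma = sqrt((cosU2 * sinlam) ** 2
                         + (cosU1 * sinU2 - sinU1 * cosU2 * coslam) ** 2)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * coslam
        sigma = atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sinlam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m
                + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break
    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
        - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
                 * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - d_sigma) / 1000.0

# ZKP (65°44'12"N, 150°42'18"E) to NZH (49°34'0"N, 117°19'48"E)
print(round(vincenty_km(65.73667, 150.705, 49.56667, 117.33), 1))  # ≈ 2629 km
```

Because Vincenty's method accounts for the Earth's flattening, its result differs slightly from the spherical haversine figure below.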

Haversine formula
  • 1629.365 miles
  • 2622.209 kilometers
  • 1415.879 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
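The haversine computation is compact enough to sketch in full. A mean Earth radius of 6371 km is assumed, with the airport coordinates from the table below converted to decimal degrees:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance (km) on a sphere of the given mean radius."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * radius_km * asin(sqrt(a))

zkp = (65.73667, 150.705)   # Zyryanka Airport, 65°44'12"N 150°42'18"E
nzh = (49.56667, 117.33)    # Manzhouli Xijiao Airport, 49°34'0"N 117°19'48"E

print(round(haversine_km(*zkp, *nzh), 1))  # ≈ 2622 km, matching the figure above
```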

How long does it take to fly from Zyryanka to Manzhouli?

The estimated flight time from Zyryanka Airport to Manzhouli Xijiao Airport is 3 hours and 35 minutes.
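Estimates like this are typically derived from the distance, an assumed average cruise speed, and a fixed allowance for takeoff and landing. The constants below (500 mph cruise, 30-minute allowance) are illustrative assumptions, not the calculator's actual parameters, so the result is only close to the quoted figure:

```python
def estimate_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough flight time: cruise leg plus a fixed takeoff/landing allowance."""
    total_min = distance_miles / cruise_mph * 60 + overhead_min
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} hours {minutes} minutes"

print(estimate_flight_time(1634))  # → '3 hours 46 minutes', near the 3 h 35 min above
```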

Flight carbon footprint between Zyryanka Airport (ZKP) and Manzhouli Xijiao Airport (NZH)

On average, flying from Zyryanka to Manzhouli generates about 188 kg of CO2 per passenger, which is roughly 415 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
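A per-passenger figure like this can be sketched as a linear per-kilometre emission factor times the distance. The factor below (0.0715 kg CO2 per passenger-km, chosen so that 2629 km yields about 188 kg) is an assumption for illustration; the calculator's actual methodology is not stated on the page.

```python
KG_PER_LB = 0.45359237  # exact definition of the avoirdupois pound

def co2_estimate_kg(distance_km, kg_per_pax_km=0.0715):
    """Hypothetical per-passenger CO2 estimate from a linear emission factor."""
    return distance_km * kg_per_pax_km

kg = co2_estimate_kg(2629)   # ≈ 188 kg per passenger
lbs = kg / KG_PER_LB         # ≈ 414-415 lbs, depending on rounding
print(round(kg), round(lbs))
```

Real emission models are not linear in distance (takeoff burns disproportionately more fuel on short hops), so a single factor is only a first approximation.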

Map of flight path and driving directions from Zyryanka to Manzhouli

See the map of the shortest flight path between Zyryanka Airport (ZKP) and Manzhouli Xijiao Airport (NZH).

Airport information

Origin Zyryanka Airport
City: Zyryanka
Country: Russia
IATA Code: ZKP
ICAO Code: UESU
Coordinates: 65°44′12″N, 150°42′18″E
Destination Manzhouli Xijiao Airport
City: Manzhouli
Country: China
IATA Code: NZH
ICAO Code: ZBMZ
Coordinates: 49°34′0″N, 117°19′48″E