
How far is Harbin from Qiqihar?

The distance between Qiqihar (Qiqihar Sanjiazi Airport) and Harbin (Harbin Taiping International Airport) is 158 miles / 254 kilometers / 137 nautical miles.

The driving distance from Qiqihar (NDG) to Harbin (HRB) is 216 miles / 347 kilometers, and travel time by car is about 4 hours 9 minutes.

Qiqihar Sanjiazi Airport – Harbin Taiping International Airport

158 miles / 254 kilometers / 137 nautical miles


Distance from Qiqihar to Harbin

There are several ways to calculate the distance from Qiqihar to Harbin. Here are two standard methods:

Vincenty's formula (applied above)
  • 157.674 miles
  • 253.752 kilometers
  • 137.015 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
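
For illustration, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The iteration cap, the convergence tolerance, and the decimal-degree airport coordinates (converted from the airport information at the bottom of this page) are assumptions for this sketch, not the calculator's documented code.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns distance in miles."""
    a, f = 6378137.0, 1 / 298.257223563            # WGS-84 semi-major axis (m) and flattening
    b = (1 - f) * a                                 # semi-minor axis
    lat1, lon1, lat2, lon2 = map(math.radians, (lat1, lon1, lat2, lon2))
    L = lon2 - lon1
    U1, U2 = math.atan((1 - f) * math.tan(lat1)), math.atan((1 - f) * math.tan(lat2))
    sinU1, cosU1, sinU2, cosU2 = math.sin(U1), math.cos(U1), math.sin(U2), math.cos(U2)
    lam = L
    for _ in range(200):                            # iterate lambda until it converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0                              # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m
                                     + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break
    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
        - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sigma_m ** 2)))
    meters = b * A * (sigma - delta_sigma)
    return meters / 1609.344

# NDG -> HRB, coordinates from the airport information section, in decimal degrees
print(round(vincenty_miles(47.2394, 123.9178, 45.6233, 126.25), 3))  # expect ≈ 157.7 miles
```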

Haversine formula
  • 157.473 miles
  • 253.429 kilometers
  • 136.841 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
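
A compact Python sketch of the haversine calculation follows. The 6,371 km mean Earth radius is an assumed conventional value (the page does not state which radius it uses), and the coordinates are again converted from the airport information below.

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given radius, returned in miles."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    h = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(h)) / 1.609344

# NDG -> HRB, decimal degrees
print(round(haversine_miles(47.2394, 123.9178, 45.6233, 126.25), 3))  # expect ≈ 157.5 miles
```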

How long does it take to fly from Qiqihar to Harbin?

The estimated flight time from Qiqihar Sanjiazi Airport to Harbin Taiping International Airport is 47 minutes.
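
The page does not state how this estimate is derived. A common rule of thumb, assumed here purely for illustration, adds roughly 30 minutes of taxi, climb, and descent overhead to cruise time at about 500 mph, which lands close to the 47-minute figure.

```python
def estimated_flight_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    """Assumed rule of thumb: fixed overhead plus cruise time at a typical jet speed."""
    return overhead_min + distance_miles / cruise_mph * 60

print(round(estimated_flight_minutes(158)))  # ≈ 49 minutes, close to the 47-minute estimate
```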

What is the time difference between Qiqihar and Harbin?

There is no time difference between Qiqihar and Harbin.

Flight carbon footprint between Qiqihar Sanjiazi Airport (NDG) and Harbin Taiping International Airport (HRB)

On average, flying from Qiqihar to Harbin generates about 48 kg of CO2 per passenger, equivalent to roughly 106 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
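
The sketch below shows the unit conversion and the per-kilometre emission factor implied by the page's own numbers; the ≈0.19 kg CO2 per passenger-kilometre value is back-calculated from those figures, not an independently sourced factor.

```python
co2_kg = 48.0          # estimated CO2 per passenger, from above
distance_km = 254.0    # flight distance, from above

co2_lb = co2_kg * 2.20462        # kilograms to pounds
per_km = co2_kg / distance_km    # emission factor implied by the page's figures

print(round(co2_lb))             # ≈ 106 lb
print(round(per_km, 3))          # ≈ 0.189 kg CO2 per passenger-kilometre
```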

Map of flight path and driving directions from Qiqihar to Harbin

See the map of the shortest flight path between Qiqihar Sanjiazi Airport (NDG) and Harbin Taiping International Airport (HRB).

Airport information

Origin Qiqihar Sanjiazi Airport
City: Qiqihar
Country: China
IATA Code: NDG
ICAO Code: ZYQQ
Coordinates: 47°14′22″N, 123°55′4″E
Destination Harbin Taiping International Airport
City: Harbin
Country: China
IATA Code: HRB
ICAO Code: ZYHB
Coordinates: 45°37′24″N, 126°15′0″E
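
The coordinates above are given in degrees, minutes, and seconds. The small helper below (its name and structure are assumptions for illustration) converts them to the signed decimal degrees used by the distance formulas earlier on this page.

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# Qiqihar Sanjiazi Airport (NDG): 47°14′22″N, 123°55′4″E
print(round(dms_to_decimal(47, 14, 22, "N"), 4))   # 47.2394
print(round(dms_to_decimal(123, 55, 4, "E"), 4))   # 123.9178

# Harbin Taiping International Airport (HRB): 45°37′24″N, 126°15′0″E
print(round(dms_to_decimal(45, 37, 24, "N"), 4))   # 45.6233
print(round(dms_to_decimal(126, 15, 0, "E"), 4))   # 126.25
```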