How far is Qiqihar from Shanghai?

The distance between Shanghai (Shanghai Hongqiao International Airport) and Qiqihar (Qiqihar Sanjiazi Airport) is 1115 miles / 1795 kilometers / 969 nautical miles.

The driving distance from Shanghai (SHA) to Qiqihar (NDG) is 1450 miles / 2333 kilometers, and travel time by car is about 26 hours 15 minutes.


Distance from Shanghai to Qiqihar

There are several ways to calculate the distance from Shanghai to Qiqihar. Here are two standard methods:

Vincenty's formula (applied above)
  • 1115.112 miles
  • 1794.599 kilometers
  • 969.006 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
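As a sketch, the standard iterative Vincenty inverse solution can be written in pure Python. The WGS-84 ellipsoid constants below are the conventional choice (an assumption; the page does not state which ellipsoid it uses), and the test coordinates are the decimal equivalents of the airport positions listed further down:

```python
from math import radians, sin, cos, tan, atan, atan2, sqrt

def vincenty_distance_m(lat1, lon1, lat2, lon2):
    """Iterative Vincenty inverse solution on the WGS-84 ellipsoid.
    Returns the geodesic distance in metres."""
    a = 6378137.0          # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563  # WGS-84 flattening
    b = (1 - f) * a        # semi-minor axis
    # Reduced latitudes on the auxiliary sphere
    U1 = atan((1 - f) * tan(radians(lat1)))
    U2 = atan((1 - f) * tan(radians(lat2)))
    L = radians(lon2 - lon1)
    sinU1, cosU1, sinU2, cosU2 = sin(U1), cos(U1), sin(U2), cos(U2)
    lam = L
    for _ in range(200):   # iterate longitude difference until convergence
        sin_lam, cos_lam = sin(lam), cos(lam)
        sin_sigma = sqrt((cosU2 * sin_lam) ** 2 +
                         (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        if sin_sigma == 0:
            return 0.0     # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                   if cos2_alpha else 0.0)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma *
                                     (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break
    u2 = cos2_alpha * (a * a - b * b) / (b * b)
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2) -
        B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) *
        (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma)

# SHA (31°11′52″N, 121°20′9″E) to NDG (47°14′22″N, 123°55′4″E)
d_km = vincenty_distance_m(31.19778, 121.33583, 47.23944, 123.91778) / 1000
print(round(d_km, 1))  # close to the 1794.599 km figure quoted above
```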

Haversine formula
  • 1116.791 miles
  • 1797.301 kilometers
  • 970.465 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
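A minimal haversine implementation looks like this; assuming a mean Earth radius of 6371 km, it reproduces the figures above to within rounding:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, R=6371.0):
    """Great-circle distance between two points on a sphere of radius R (km)."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    h = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * R * asin(sqrt(h))

# SHA to NDG, using the decimal forms of the coordinates listed below
print(round(haversine_km(31.19778, 121.33583, 47.23944, 123.91778), 1))
```

The small difference from the Vincenty result (about 3 km here) comes entirely from treating the Earth as a sphere rather than an ellipsoid.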

How long does it take to fly from Shanghai to Qiqihar?

The estimated flight time from Shanghai Hongqiao International Airport to Qiqihar Sanjiazi Airport is 2 hours and 36 minutes.
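A common rule of thumb for such estimates is a fixed overhead for taxi, climb, and descent plus cruise time at a typical jet speed. The constants below are illustrative assumptions, not the site's actual formula, which is why the result differs slightly from the 2 hours 36 minutes quoted above:

```python
def flight_time_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough flight-time estimate: fixed taxi/climb/descent overhead plus
    cruise time. Both parameters are illustrative assumptions."""
    return round(overhead_min + distance_miles / cruise_mph * 60)

minutes = flight_time_minutes(1115)
print(f"{minutes // 60} h {minutes % 60} min")  # 2 h 44 min with these assumptions
```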

What is the time difference between Shanghai and Qiqihar?

There is no time difference between Shanghai and Qiqihar; both cities observe China Standard Time (UTC+8).

Flight carbon footprint between Shanghai Hongqiao International Airport (SHA) and Qiqihar Sanjiazi Airport (NDG)

On average, flying from Shanghai to Qiqihar generates about 158 kg (347 pounds) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.

Map of flight path and driving directions from Shanghai to Qiqihar

See the map of the shortest flight path between Shanghai Hongqiao International Airport (SHA) and Qiqihar Sanjiazi Airport (NDG).

Airport information

Origin Shanghai Hongqiao International Airport
City: Shanghai
Country: China
IATA Code: SHA
ICAO Code: ZSSS
Coordinates: 31°11′52″N, 121°20′9″E
Destination Qiqihar Sanjiazi Airport
City: Qiqihar
Country: China
IATA Code: NDG
ICAO Code: ZYQQ
Coordinates: 47°14′22″N, 123°55′4″E
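The coordinates above are given in degrees, minutes, and seconds; converting them to the decimal degrees that distance formulas expect is straightforward:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert a degrees/minutes/seconds coordinate to signed decimal degrees.
    Southern and western hemispheres are negative by convention."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# Shanghai Hongqiao (SHA): 31°11′52″N, 121°20′9″E
lat = dms_to_decimal(31, 11, 52, "N")   # ≈ 31.1978
lon = dms_to_decimal(121, 20, 9, "E")   # ≈ 121.3358
```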