
Distance between Qiqihar (NDG) and Nanjing (NKG)

Flight distance from Qiqihar to Nanjing (Qiqihar Sanjiazi Airport – Nanjing Lukou International Airport) is 1102 miles / 1774 kilometers / 958 nautical miles. Estimated flight time is 2 hours 35 minutes.

Driving distance from Qiqihar (NDG) to Nanjing (NKG) is 1413 miles / 2274 kilometers and travel time by car is about 25 hours 51 minutes.

Qiqihar – Nanjing

1102 Miles
1774 Kilometers
958 Nautical miles

How far is Nanjing from Qiqihar?

There are several ways to calculate the distance between Qiqihar and Nanjing. Here are two common methods:

Vincenty's formula (applied above)
  • 1102.159 miles
  • 1773.754 kilometers
  • 957.750 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth’s surface, using an ellipsoidal model of the earth.
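As a rough illustration, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The ellipsoid, the convergence tolerance, and the decimal coordinates (converted from the airport information below) are assumptions; this is not the calculator's own implementation.

import math

def vincenty_miles(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    # WGS-84 ellipsoid parameters (assumed)
    a = 6378137.0               # semi-major axis in meters
    f = 1 / 298.257223563       # flattening
    b = (1 - f) * a             # semi-minor axis

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.sqrt((cosU2 * sin_lam) ** 2 +
                              (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        if sin_sigma == 0:
            return 0.0          # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m + C * cos_sigma *
                                     (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
        B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) *
        (-3 + 4 * cos_2sigma_m ** 2)))
    meters = b * A * (sigma - delta_sigma)
    return meters / 1609.344    # meters to statute miles

# NDG and NKG coordinates converted to decimal degrees (approximate)
print(round(vincenty_miles(47.2394, 123.9178, 31.7419, 118.8619), 1))  # roughly 1102 miles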

Haversine formula
  • 1103.578 miles
  • 1776.036 kilometers
  • 958.983 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
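A minimal haversine sketch, assuming a mean earth radius of 6371 km (the page does not state which radius it uses); with that assumption the result lands close to the figures above.

import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

km = haversine_km(47.2394, 123.9178, 31.7419, 118.8619)
print(round(km, 1), round(km / 1.609344, 1), round(km / 1.852, 1))  # km, miles, nautical miles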

Flight Duration

Estimated flight time from Qiqihar Sanjiazi Airport to Nanjing Lukou International Airport is 2 hours 35 minutes.
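The page does not explain how this estimate is derived. As a rough check, the implied average speed over the whole flight can be computed from the distance and time above.

distance_miles = 1102.159
hours = 2 + 35 / 60
print(round(distance_miles / hours))  # about 427 mph averaged over the flight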

Time difference

There is no time difference between Qiqihar and Nanjing.

Carbon dioxide emissions

On average, flying from Qiqihar to Nanjing generates about 157 kg of CO2 per passenger; 157 kilograms is equal to 346 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
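The kilograms-to-pounds conversion above can be checked directly (1 kg ≈ 2.20462 lb):

co2_kg = 157
co2_lbs = co2_kg * 2.20462   # kilograms to pounds
print(round(co2_lbs))        # about 346 lbs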

Map of flight path and driving directions from Qiqihar to Nanjing

Shortest flight path between Qiqihar Sanjiazi Airport (NDG) and Nanjing Lukou International Airport (NKG).

Airport information

Origin Qiqihar Sanjiazi Airport
City: Qiqihar
Country: China
IATA Code: NDG
ICAO Code: ZYQQ
Coordinates: 47°14′22″N, 123°55′4″E
Destination Nanjing Lukou International Airport
City: Nanjing
Country: China
IATA Code: NKG
ICAO Code: ZSNJ
Coordinates: 31°44′31″N, 118°51′43″E
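
A small helper (an assumed utility, not part of the site) to convert the DMS coordinates listed above into the decimal degrees used by the distance sketches earlier:

def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

print(dms_to_decimal(47, 14, 22, "N"), dms_to_decimal(123, 55, 4, "E"))   # NDG: about 47.2394, 123.9178
print(dms_to_decimal(31, 44, 31, "N"), dms_to_decimal(118, 51, 43, "E"))  # NKG: about 31.7419, 118.8619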