
How far is Nanjing from Burqin?

The distance between Burqin (Burqin Kanas Airport) and Nanjing (Nanjing Lukou International Airport) is 2015 miles / 3243 kilometers / 1751 nautical miles.

The driving distance from Burqin (KJI) to Nanjing (NKG) is 2577 miles / 4147 kilometers, and travel time by car is about 47 hours 19 minutes.

Burqin Kanas Airport – Nanjing Lukou International Airport

2015 Miles
3243 Kilometers
1751 Nautical miles


Distance from Burqin to Nanjing

There are several ways to calculate the distance from Burqin to Nanjing. Here are two standard methods:

Vincenty's formula (applied above)
  • 2015.130 miles
  • 3243.037 kilometers
  • 1751.100 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
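The ellipsoidal calculation above can be sketched with the standard Vincenty inverse iteration on the WGS-84 ellipsoid. The airport coordinates below are the page's own, converted to decimal degrees; everything else follows the published algorithm.

```python
import math

def vincenty_m(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns metres."""
    a = 6378137.0                 # semi-major axis (m)
    f = 1 / 298.257223563         # flattening
    b = (1 - f) * a               # semi-minor axis (m)
    L = math.radians(lon2 - lon1)
    # reduced latitudes
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)
    lam = L
    for _ in range(max_iter):
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.hypot(cosU2 * sinLam,
                              cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        if sinSigma == 0:
            return 0.0            # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = (cosSigma - 2 * sinU1 * sinU2 / cos2Alpha
                      if cos2Alpha else 0.0)   # equatorial line
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (
                cos2SigmaM + C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < tol:
            break
    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    deltaSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2) -
        B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2)
        * (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - deltaSigma)

# KJI (48°13′20″N, 86°59′45″E) to NKG (31°44′31″N, 118°51′43″E)
km = vincenty_m(48.22222, 86.99583, 31.74194, 118.86194) / 1000
```

With these inputs the result lands at roughly 3243 km, matching the figure above.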

Haversine formula
  • 2012.643 miles
  • 3239.036 kilometers
  • 1748.939 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
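The spherical version is much shorter. This sketch uses the mean Earth radius of 6371 km and the same airport coordinates converted to decimal degrees:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, r=6371.0):
    """Great-circle distance on a sphere of radius r (km)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# KJI to NKG: roughly 3239 km, a few km short of the ellipsoidal value
km = haversine_km(48.22222, 86.99583, 31.74194, 118.86194)
```

The spherical result is a few kilometres shorter than Vincenty's, which is the expected size of the sphere-versus-ellipsoid discrepancy on a route of this length.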

How long does it take to fly from Burqin to Nanjing?

The estimated flight time from Burqin Kanas Airport to Nanjing Lukou International Airport is 4 hours and 18 minutes.
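A simple way to reproduce an estimate like this is block distance divided by an assumed average ground speed. The 469 mph figure below is not from the page; it is back-solved from the page's own numbers (2015 miles in 4 h 18 min) purely for illustration:

```python
def flight_time(distance_miles, avg_speed_mph=469):
    """Estimate flight time as (hours, minutes) at an assumed average speed."""
    hours = distance_miles / avg_speed_mph
    h = int(hours)
    m = round((hours - h) * 60)
    return h, m

h, m = flight_time(2015)  # the page's distance in statute miles
```

Real block times also include taxi, climb, descent and winds, so any fixed-speed estimate is only a rough guide.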

What is the time difference between Burqin and Nanjing?

There is no time difference between Burqin and Nanjing.

Flight carbon footprint between Burqin Kanas Airport (KJI) and Nanjing Lukou International Airport (NKG)

On average, flying from Burqin to Nanjing generates about 219 kg of CO2 per passenger, which is roughly 483 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
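Per-passenger CO2 estimates like this are typically distance times an emission factor. The 67.5 g CO2 per passenger-km used below is not stated on the page; it is back-solved from the page's own figures (219 kg over 3243 km) and real factors vary with aircraft type, load factor and routing:

```python
KG_PER_LB = 0.45359237  # exact definition of the avoirdupois pound

def co2_per_passenger_kg(distance_km, factor_g_per_pkm=67.5):
    """CO2 per passenger in kg, from distance and an assumed emission factor."""
    return distance_km * factor_g_per_pkm / 1000

def kg_to_lb(kg):
    """Convert kilograms to pounds."""
    return kg / KG_PER_LB

kg = co2_per_passenger_kg(3243)  # ~219 kg for this route's distance
lb = kg_to_lb(219)               # ~483 lb
```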

Map of flight path and driving directions from Burqin to Nanjing

See the map of the shortest flight path between Burqin Kanas Airport (KJI) and Nanjing Lukou International Airport (NKG).

Airport information

Origin Burqin Kanas Airport
City: Burqin
Country: China
IATA Code: KJI
ICAO Code: ZWKN
Coordinates: 48°13′20″N, 86°59′45″E
Destination Nanjing Lukou International Airport
City: Nanjing
Country: China
IATA Code: NKG
ICAO Code: ZSNJ
Coordinates: 31°44′31″N, 118°51′43″E
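The coordinates above are given in degrees-minutes-seconds, while distance formulas want decimal degrees. The conversion is a small helper (the hemisphere letters follow the usual convention that S and W are negative):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere="N"):
    """Convert degrees/minutes/seconds plus hemisphere to decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# KJI: 48°13′20″N, 86°59′45″E  ->  (48.2222, 86.9958)
kji = (dms_to_decimal(48, 13, 20, "N"), dms_to_decimal(86, 59, 45, "E"))
# NKG: 31°44′31″N, 118°51′43″E ->  (31.7419, 118.8619)
nkg = (dms_to_decimal(31, 44, 31, "N"), dms_to_decimal(118, 51, 43, "E"))
```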