
How far is Jiagedaqi from Novy Urengoy?

The distance between Novy Urengoy (Novy Urengoy Airport) and Jiagedaqi (Jiagedaqi Airport) is 1978 miles / 3183 kilometers / 1719 nautical miles.

The driving distance from Novy Urengoy (NUX) to Jiagedaqi (JGD) is 3828 miles / 6160 kilometers, and travel time by car is about 92 hours 35 minutes.

Novy Urengoy Airport – Jiagedaqi Airport

1978 miles / 3183 kilometers / 1719 nautical miles


Distance from Novy Urengoy to Jiagedaqi

There are several ways to calculate the distance from Novy Urengoy to Jiagedaqi. Here are two standard methods:

Vincenty's formula (applied above)
  • 1978.034 miles
  • 3183.337 kilometers
  • 1718.864 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
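For illustration, here is a minimal Python sketch of an ellipsoidal distance calculation using the geographiclib package. One hedge: geographiclib implements Karney's algorithm rather than Vincenty's original iteration; both solve the same WGS-84 geodesic problem, so the result should agree with the figures above to within the rounding of the coordinates.

```python
from geographiclib.geodesic import Geodesic

# Airport coordinates converted from the DMS values in the airport
# information section at the bottom of this page.
nux = (66.0692, 76.5203)    # NUX: 66°4′9″N, 76°31′13″E
jgd = (50.3714, 124.1175)   # JGD: 50°22′17″N, 124°7′3″E

# Inverse() solves the inverse geodesic problem on the WGS-84 ellipsoid;
# 's12' is the geodesic distance in meters.
meters = Geodesic.WGS84.Inverse(nux[0], nux[1], jgd[0], jgd[1])["s12"]

print(f"{meters / 1609.344:.3f} miles")        # ≈ 1978 mi
print(f"{meters / 1000:.3f} kilometers")       # ≈ 3183 km
print(f"{meters / 1852:.3f} nautical miles")   # ≈ 1719 nmi
```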

Haversine formula
  • 1972.224 miles
  • 3173.986 kilometers
  • 1713.815 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
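A self-contained Python sketch of the haversine formula follows, using the same airport coordinates. The mean Earth radius of 6371 km is an assumption (the exact radius used above is not stated), so the last decimals may differ slightly.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given mean Earth radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

km = haversine_km(66.0692, 76.5203, 50.3714, 124.1175)  # NUX -> JGD
print(f"{km:.3f} kilometers")              # ≈ 3174 km
print(f"{km / 1.609344:.3f} miles")        # ≈ 1972 mi
print(f"{km / 1.852:.3f} nautical miles")  # ≈ 1714 nmi
```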

How long does it take to fly from Novy Urengoy to Jiagedaqi?

The estimated flight time from Novy Urengoy Airport to Jiagedaqi Airport is 4 hours and 14 minutes.
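Estimates like this are typically computed from the straight-line distance, an assumed average block speed, and a fixed overhead for taxi, climb, and descent. The parameters in the sketch below are hypothetical assumptions, not published values, so its output will not exactly match the 4 hours 14 minutes above.

```python
def estimated_flight_time(distance_miles, avg_speed_mph=500.0, overhead_min=30):
    # avg_speed_mph and overhead_min (taxi, climb, descent) are assumed,
    # hypothetical parameters; the calculator's exact values are not published.
    total_min = round(distance_miles / avg_speed_mph * 60) + overhead_min
    return divmod(total_min, 60)  # (hours, minutes)

hours, minutes = estimated_flight_time(1978.034)
print(f"{hours} hours {minutes} minutes")  # 4 hours 27 minutes with these assumptions
```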

Flight carbon footprint between Novy Urengoy Airport (NUX) and Jiagedaqi Airport (JGD)

On average, flying from Novy Urengoy to Jiagedaqi generates about 216 kg of CO2 per passenger, which is equivalent to 475 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
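As a sketch of how such a figure can be reproduced, the per-passenger emission factor below is back-solved from the numbers in this section (216 kg ÷ 1978 miles ≈ 0.109 kg per passenger-mile); it is a hypothetical value, not a published factor.

```python
# Hypothetical emission factor, back-solved from the figures above.
KG_CO2_PER_PASSENGER_MILE = 0.109
KG_TO_LBS = 2.20462

co2_kg = 1978.034 * KG_CO2_PER_PASSENGER_MILE
print(f"{co2_kg:.0f} kg CO2 per passenger")  # ≈ 216 kg
print(f"{co2_kg * KG_TO_LBS:.0f} lbs")       # ≈ 475 lbs
```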

Map of flight path and driving directions from Novy Urengoy to Jiagedaqi

See the map of the shortest flight path between Novy Urengoy Airport (NUX) and Jiagedaqi Airport (JGD).

Airport information

Origin: Novy Urengoy Airport
City: Novy Urengoy
Country: Russia
IATA Code: NUX
ICAO Code: USMU
Coordinates: 66°4′9″N, 76°31′13″E
Destination: Jiagedaqi Airport
City: Jiagedaqi
Country: China
IATA Code: JGD
ICAO Code: ZYJD
Coordinates: 50°22′17″N, 124°7′3″E
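The coordinates above are given in degrees, minutes, and seconds (DMS). A small Python sketch for converting them to the signed decimal degrees used in the distance formulas earlier on this page:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus an N/S/E/W hemisphere
    letter to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# NUX: 66°4′9″N, 76°31′13″E
print(round(dms_to_decimal(66, 4, 9, "N"), 4),
      round(dms_to_decimal(76, 31, 13, "E"), 4))   # 66.0692 76.5203

# JGD: 50°22′17″N, 124°7′3″E
print(round(dms_to_decimal(50, 22, 17, "N"), 4),
      round(dms_to_decimal(124, 7, 3, "E"), 4))    # 50.3714 124.1175
```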