
How far is Arxan from Novy Urengoy?

The distance between Novy Urengoy (Novy Urengoy Airport) and Arxan (Arxan Yi'ershi Airport) is 2033 miles / 3271 kilometers / 1766 nautical miles.

The driving distance from Novy Urengoy (NUX) to Arxan (YIE) is 3690 miles / 5939 kilometers, and travel time by car is about 88 hours 35 minutes.

Novy Urengoy Airport – Arxan Yi'ershi Airport

2033 miles / 3271 kilometers / 1766 nautical miles


Distance from Novy Urengoy to Arxan

There are several ways to calculate the distance from Novy Urengoy to Arxan. Here are two standard methods:

Vincenty's formula (applied above)
  • 2032.591 miles
  • 3271.138 kilometers
  • 1766.273 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
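As a rough illustration (not the site's exact implementation), here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid, using the airport coordinates listed further down the page. For this pair of points it converges to roughly the 2,033-mile figure quoted above.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns distance in statute miles."""
    a = 6378137.0                 # semi-major axis (m)
    f = 1 / 298.257223563        # flattening
    b = (1 - f) * a               # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):          # iterate until lambda converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0            # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
        B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sigma_m ** 2)))
    meters = b * A * (sigma - delta_sigma)
    return meters / 1609.344      # metres -> statute miles

# NUX (66°4'9"N, 76°31'13"E) to YIE (47°18'38"N, 119°54'42"E), in decimal degrees
print(round(vincenty_miles(66.0692, 76.5203, 47.3106, 119.9117), 1))  # ~2032.6 miles
```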

Haversine formula
  • 2027.343 miles
  • 3262.692 kilometers
  • 1761.712 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
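For comparison, a self-contained haversine sketch in Python (assuming a mean Earth radius of 6,371 km and the same airport coordinates) reproduces the spherical figure of roughly 2,027 miles.

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given mean radius, in statute miles."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    c = 2 * math.atan2(math.sqrt(a), math.sqrt(1 - a))
    return radius_km * c / 1.609344   # km -> statute miles

def dms_to_decimal(deg, minutes, seconds):
    """Convert degrees/minutes/seconds to decimal degrees."""
    return deg + minutes / 60 + seconds / 3600

# NUX: 66°4'9"N, 76°31'13"E   YIE: 47°18'38"N, 119°54'42"E
nux = (dms_to_decimal(66, 4, 9), dms_to_decimal(76, 31, 13))
yie = (dms_to_decimal(47, 18, 38), dms_to_decimal(119, 54, 42))
print(round(haversine_miles(*nux, *yie), 1))  # ~2027 miles
```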

How long does it take to fly from Novy Urengoy to Arxan?

The estimated flight time from Novy Urengoy Airport to Arxan Yi'ershi Airport is 4 hours and 20 minutes.
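The assumptions behind that estimate aren't stated, but a common back-of-the-envelope approach is the great-circle distance divided by an assumed average block speed of about 500 mph, plus a fixed allowance for taxi, takeoff and climb. With those assumed numbers the result lands close to the quoted 4 hours 20 minutes:

```python
# Hypothetical reconstruction: the cruise speed and fixed allowance are assumptions,
# not figures published on this page.
distance_miles = 2033
cruise_mph = 500          # assumed average block speed
overhead_hours = 0.25     # assumed allowance for taxi, takeoff and climb

hours = distance_miles / cruise_mph + overhead_hours
h, m = int(hours), round((hours - int(hours)) * 60)
print(f"{h} h {m:02d} min")   # -> 4 h 19 min, close to the quoted 4 h 20 min
```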

Flight carbon footprint between Novy Urengoy Airport (NUX) and Arxan Yi'ershi Airport (YIE)

On average, flying from Novy Urengoy to Arxan generates about 221 kg of CO2 per passenger, which is about 488 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
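The kilogram-to-pound conversion is straightforward (1 kg ≈ 2.20462 lb); starting from the rounded 221 kg figure gives about 487 lb, so the quoted 488 lb presumably comes from the unrounded per-passenger estimate.

```python
co2_kg = 221                       # rounded per-passenger estimate from above
print(round(co2_kg * 2.20462))     # -> 487 lb (the page quotes 488 lb, likely from the unrounded value)
```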

Map of flight path and driving directions from Novy Urengoy to Arxan

See the map of the shortest flight path between Novy Urengoy Airport (NUX) and Arxan Yi'ershi Airport (YIE).

Airport information

Origin Novy Urengoy Airport
City: Novy Urengoy
Country: Russia
IATA Code: NUX
ICAO Code: USMU
Coordinates: 66°4′9″N, 76°31′13″E
Destination Arxan Yi'ershi Airport
City: Arxan
Country: China
IATA Code: YIE
ICAO Code: ZBES
Coordinates: 47°18′38″N, 119°54′42″E