
How far is Baishan from Novy Urengoy?

The distance between Novy Urengoy (Novy Urengoy Airport) and Baishan (Changbaishan Airport) is 2543 miles / 4092 kilometers / 2210 nautical miles.

The driving distance from Novy Urengoy (NUX) to Baishan (NBS) is 4854 miles / 7811 kilometers, and travel time by car is about 95 hours 27 minutes.

Novy Urengoy Airport – Changbaishan Airport


Distance from Novy Urengoy to Baishan

There are several ways to calculate the distance from Novy Urengoy to Baishan. Here are two standard methods:

Vincenty's formula (applied above)
  • 2542.818 miles
  • 4092.268 kilometers
  • 2209.648 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
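The ellipsoidal calculation can be sketched in pure Python. This is a minimal transcription of Vincenty's inverse formula on the WGS-84 ellipsoid (not necessarily the exact implementation this site uses), with the two airports' coordinates taken from the airport information below, rounded to four decimal places:

```python
import math

def vincenty_distance_km(lat1, lon1, lat2, lon2):
    """Vincenty's inverse formula on the WGS-84 ellipsoid; returns kilometers.

    Note: the iteration can fail to converge for near-antipodal points, and
    sin_sigma is zero for coincident points; neither applies to this route.
    """
    a = 6378137.0             # WGS-84 semi-major axis (meters)
    f = 1 / 298.257223563     # WGS-84 flattening
    b = (1 - f) * a           # semi-minor axis (meters)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate until the longitude difference converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (2 * cos_2sigma_m ** 2 - 1)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0

# NUX (66°4′9″N, 76°31′13″E) to NBS (42°4′0″N, 127°36′7″E)
print(round(vincenty_distance_km(66.0692, 76.5203, 42.0667, 127.6019), 1))
```

The result agrees with the 4092.268 km figure above to well within the rounding of the input coordinates.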

Haversine formula
  • 2537.073 miles
  • 4083.024 kilometers
  • 2204.656 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
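The spherical calculation is much shorter. A minimal haversine implementation, assuming a mean Earth radius of 6371 km and the same coordinates as above:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points on a sphere, in km."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# NUX to NBS
print(round(haversine_km(66.0692, 76.5203, 42.0667, 127.6019), 1))
```

The spherical result lands a few kilometers short of the ellipsoidal one, which matches the gap between the two sets of figures above.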

How long does it take to fly from Novy Urengoy to Baishan?

The estimated flight time from Novy Urengoy Airport to Changbaishan Airport is 5 hours and 18 minutes.
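The site does not publish its flight-time model, but the stated figure is consistent with a simple heuristic: distance divided by an assumed average block speed. The ~480 mph default below is an assumption chosen to illustrate the idea, not a documented parameter:

```python
def estimate_flight_time(distance_miles, avg_block_mph=480.0, overhead_hours=0.0):
    """Rough flight-time estimate: distance over an assumed average block
    speed, plus an optional fixed taxi/climb/descent allowance.
    Both parameters are illustrative assumptions, not the site's model."""
    hours = distance_miles / avg_block_mph + overhead_hours
    h = int(hours)
    m = round((hours - h) * 60)
    return f"{h} hours {m} minutes"

print(estimate_flight_time(2543))  # prints "5 hours 18 minutes"
```

With an assumed 480 mph average speed this happens to reproduce the 5 hours 18 minutes quoted above; a different speed or a fixed overhead term would shift the estimate accordingly.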

Flight carbon footprint between Novy Urengoy Airport (NUX) and Changbaishan Airport (NBS)

On average, flying from Novy Urengoy to Baishan generates about 280 kg of CO2 per passenger, or roughly 618 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
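The figures above imply a linear per-mile factor of about 280 kg / 2543 miles ≈ 0.110 kg per passenger-mile. The sketch below uses that implied factor; it is an inference from this page's numbers, not the site's actual emissions model:

```python
KG_PER_MILE = 280 / 2543   # factor implied by this page's figures (~0.110 kg/mi)
KG_TO_LBS = 2.20462        # kilograms to pounds

def co2_estimate(distance_miles):
    """Per-passenger CO2 from jet fuel burn, using the implied linear factor.
    Returns (kilograms, pounds)."""
    kg = distance_miles * KG_PER_MILE
    return kg, kg * KG_TO_LBS

kg, lbs = co2_estimate(2543)
print(f"{kg:.0f} kg ({lbs:.1f} lbs)")
```

Note that 280 kg converts to about 617.3 lbs; the page's 618 lbs presumably comes from converting before rounding the kilogram figure.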

Map of flight path and driving directions from Novy Urengoy to Baishan

See the map of the shortest flight path between Novy Urengoy Airport (NUX) and Changbaishan Airport (NBS).

Airport information

Origin Novy Urengoy Airport
City: Novy Urengoy
Country: Russia
IATA Code: NUX
ICAO Code: USMU
Coordinates: 66°4′9″N, 76°31′13″E
Destination Changbaishan Airport
City: Baishan
Country: China
IATA Code: NBS
ICAO Code: ZYBS
Coordinates: 42°4′0″N, 127°36′7″E
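The coordinates above are given in degrees/minutes/seconds; the distance formulas earlier need them as decimal degrees. The conversion is straightforward:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter (N/S/E/W)
    to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# Coordinates from the airport information above
nux = (dms_to_decimal(66, 4, 9, "N"), dms_to_decimal(76, 31, 13, "E"))
nbs = (dms_to_decimal(42, 4, 0, "N"), dms_to_decimal(127, 36, 7, "E"))
print(nux, nbs)
```

For NUX this yields approximately (66.0692, 76.5203), the values used in the distance examples above.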