
How far is Baishan from Nefteyugansk?

The distance between Nefteyugansk (Nefteyugansk Airport) and Baishan (Changbaishan Airport) is 2602 miles / 4188 kilometers / 2261 nautical miles.

The driving distance from Nefteyugansk (NFG) to Baishan (NBS) is 4357 miles / 7012 kilometers, and travel time by car is about 83 hours 52 minutes.


Distance from Nefteyugansk to Baishan

There are several ways to calculate the distance from Nefteyugansk to Baishan. Here are two standard methods:

Vincenty's formula (applied above)
  • 2602.447 miles
  • 4188.233 kilometers
  • 2261.465 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
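As a sketch, Vincenty's inverse method on the WGS-84 ellipsoid (semi-major axis a = 6378137 m, flattening f = 1/298.257223563) can be written in Python roughly as follows. The airport coordinates are taken from the airport information below; the implementation follows the standard published iteration and is not the exact code used by this site.

```python
import math

def vincenty_m(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Geodesic distance in metres on the WGS-84 ellipsoid
    (Vincenty's inverse formula; may not converge for
    near-antipodal points, which is not an issue on this route)."""
    a = 6378137.0
    f = 1 / 298.257223563
    b = (1 - f) * a
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)
    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break
    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    dsigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
          * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - dsigma)

# NFG (61°6′29″N, 72°39′0″E) to NBS (42°4′0″N, 127°36′7″E)
dist_m = vincenty_m(61.108056, 72.65, 42.066667, 127.601944)
```

Dividing `dist_m` by 1000 reproduces the 4188 km figure quoted above to within rounding.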

Haversine formula
  • 2596.027 miles
  • 4177.901 kilometers
  • 2255.886 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical Earth. The result is the great-circle distance, the shortest path between two points along the sphere's surface.
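The haversine formula is short enough to show in full. A minimal Python version, assuming a mean Earth radius of 6371 km (the exact radius used by this site is not stated):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in km between two lat/lon points
    on a sphere of the given radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# NFG (61°6′29″N, 72°39′0″E) to NBS (42°4′0″N, 127°36′7″E)
d_km = haversine_km(61.108056, 72.65, 42.066667, 127.601944)
```

This returns roughly 4178 km, matching the haversine figure above; the small gap versus Vincenty's result reflects the spherical approximation.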

How long does it take to fly from Nefteyugansk to Baishan?

The estimated flight time from Nefteyugansk Airport to Changbaishan Airport is 5 hours and 25 minutes.
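Flight-time estimates like this are typically distance divided by an assumed cruise speed, plus a fixed allowance for take-off, climb, descent, and landing. The exact parameters used by this site are not stated; the values below are illustrative assumptions:

```python
def flight_time_hours(distance_miles, cruise_mph=500, overhead_hours=0.5):
    """Rough airliner flight-time estimate: cruise time plus a fixed
    allowance for take-off and landing (both parameters assumed)."""
    return distance_miles / cruise_mph + overhead_hours

hours = flight_time_hours(2602)  # roughly 5.7 hours for this route
```

With these assumptions the estimate lands in the same five-to-six-hour range as the figure quoted above.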

Flight carbon footprint between Nefteyugansk Airport (NFG) and Changbaishan Airport (NBS)

On average, flying from Nefteyugansk to Baishan generates about 287 kg (633 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
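The kilogram-to-pound conversion behind that figure is straightforward (1 kg = 2.20462 lb):

```python
KG_TO_LB = 2.20462  # pounds per kilogram

co2_kg = 287
co2_lb = round(co2_kg * KG_TO_LB)  # 633
```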

Map of flight path and driving directions from Nefteyugansk to Baishan

See the map of the shortest flight path between Nefteyugansk Airport (NFG) and Changbaishan Airport (NBS).

Airport information

Origin Nefteyugansk Airport
City: Nefteyugansk
Country: Russia
IATA Code: NFG
ICAO Code: USRN
Coordinates: 61°6′29″N, 72°39′0″E
Destination Changbaishan Airport
City: Baishan
Country: China
IATA Code: NBS
ICAO Code: ZYBS
Coordinates: 42°4′0″N, 127°36′7″E