
How far is Batagay from Nefteyugansk?

The distance between Nefteyugansk (Nefteyugansk Airport) and Batagay (Batagay Airport) is 1829 miles / 2944 kilometers / 1590 nautical miles.

The driving distance from Nefteyugansk (NFG) to Batagay (BQJ) is 4975 miles / 8007 kilometers, and travel time by car is about 130 hours 9 minutes.
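The three distance units are related by fixed conversion factors (1 statute mile = 1.609344 km; 1 nautical mile = 1.852 km by definition). As a quick sketch, the precise Vincenty figure quoted further down converts like this:

```python
KM_PER_MILE = 1.609344   # statute mile, exact
KM_PER_NM = 1.852        # nautical mile, exact by definition

km = 2944.112            # Vincenty distance quoted below

print(round(km / KM_PER_MILE, 3))  # → 1829.386 miles
print(round(km / KM_PER_NM, 3))    # → 1589.693 nautical miles
```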


Distance from Nefteyugansk to Batagay

There are several ways to calculate the distance from Nefteyugansk to Batagay. Here are two standard methods:

Vincenty's formula (applied above)
  • 1829.386 miles
  • 2944.112 kilometers
  • 1589.693 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
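As a rough illustration (a sketch of the standard Vincenty inverse iteration on the WGS-84 ellipsoid, not the calculator's actual code), the computation can be written in Python:

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Ellipsoidal distance in kilometres (Vincenty inverse, WGS-84)."""
    a = 6378137.0            # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563    # WGS-84 flattening
    b = (1 - f) * a          # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0                       # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        if cos2_alpha == 0:
            cos_2sigma_m = 0.0               # equatorial geodesic
        else:
            cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:        # converged
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0

# NFG (61°6′29″N, 72°39′0″E) to BQJ (67°38′52″N, 134°41′42″E)
print(vincenty_km(61.108056, 72.65, 67.647778, 134.695))  # close to ~2944 km
```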

Haversine formula
  • 1822.520 miles
  • 2933.061 kilometers
  • 1583.726 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
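A minimal haversine implementation, assuming the commonly used mean Earth radius of 6,371 km, closely reproduces the figures above:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given mean radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# NFG (61°6′29″N, 72°39′0″E) to BQJ (67°38′52″N, 134°41′42″E)
print(haversine_km(61.108056, 72.65, 67.647778, 134.695))  # ≈ 2933 km
```

The ellipsoidal (Vincenty) result is about 11 km longer here: treating the Earth as a perfect sphere slightly underestimates this high-latitude route.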

How long does it take to fly from Nefteyugansk to Batagay?

The estimated flight time from Nefteyugansk Airport to Batagay Airport is 3 hours and 57 minutes.
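The site does not publish its flight-time model. One common approximation divides the great-circle distance by an average block speed and adds a fixed allowance for takeoff and landing; with assumed values of 530 mph and 30 minutes (hypothetical parameters, not the calculator's documented method), the arithmetic lands near the quoted figure:

```python
AVG_SPEED_MPH = 530   # assumed average block speed, not a documented value
OVERHEAD_H = 0.5      # assumed 30-minute takeoff/landing allowance

distance_mi = 1829.386
hours = distance_mi / AVG_SPEED_MPH + OVERHEAD_H
h, m = int(hours), round((hours - int(hours)) * 60)
print(f"{h} h {m} min")  # → 3 h 57 min
```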

Flight carbon footprint between Nefteyugansk Airport (NFG) and Batagay Airport (BQJ)

On average, flying from Nefteyugansk to Batagay generates about 202 kg of CO2 per passenger, which is equivalent to about 445 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
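The kilogram-to-pound conversion follows directly from the exact definition of the pound (1 lb = 0.45359237 kg):

```python
KG_PER_LB = 0.45359237   # exact definition of the avoirdupois pound

co2_kg = 202             # quoted per-passenger estimate
print(round(co2_kg / KG_PER_LB))  # → 445 lb
```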

Map of flight path and driving directions from Nefteyugansk to Batagay

See the map of the shortest flight path between Nefteyugansk Airport (NFG) and Batagay Airport (BQJ).

Airport information

Origin Nefteyugansk Airport
City: Nefteyugansk
Country: Russia
IATA Code: NFG
ICAO Code: USRN
Coordinates: 61°6′29″N, 72°39′0″E
Destination Batagay Airport
City: Batagay
Country: Russia
IATA Code: BQJ
ICAO Code: UEBB
Coordinates: 67°38′52″N, 134°41′42″E