
How far is Jixi from Nefteyugansk?

The distance between Nefteyugansk (Nefteyugansk Airport) and Jixi (Jixi Xingkaihu Airport) is 2557 miles / 4115 kilometers / 2222 nautical miles.

The driving distance from Nefteyugansk (NFG) to Jixi (JXA) is 4584 miles / 7378 kilometers, and travel time by car is about 87 hours 56 minutes.

Nefteyugansk Airport – Jixi Xingkaihu Airport

2557 miles / 4115 kilometers / 2222 nautical miles

Distance from Nefteyugansk to Jixi

There are several ways to calculate the distance from Nefteyugansk to Jixi. Here are two standard methods:

Vincenty's formula (applied above)
  • 2557.044 miles
  • 4115.163 kilometers
  • 2222.010 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
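
For illustration, an equivalent ellipsoidal calculation can be reproduced with the third-party geopy library, whose geodesic distance also uses the WGS-84 ellipsoid and agrees with Vincenty's formula to well under a metre on a route like this; the coordinates used below are the airport coordinates listed further down the page.

```python
# Ellipsoidal (WGS-84) distance between NFG and JXA: a stand-in for
# Vincenty's formula. Requires the third-party package: pip install geopy
from geopy.distance import geodesic

nfg = (61.108056, 72.650000)    # Nefteyugansk Airport, 61°6′29″N 72°39′0″E
jxa = (45.292778, 131.192778)   # Jixi Xingkaihu Airport, 45°17′34″N 131°11′34″E

d = geodesic(nfg, jxa)
print(f"{d.miles:.1f} mi / {d.kilometers:.1f} km / {d.nautical:.1f} NM")
# Roughly 2557 mi / 4115 km / 2222 NM, matching the figures above
```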

Haversine formula
  • 2549.912 miles
  • 4103.686 kilometers
  • 2215.813 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
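
The haversine calculation is compact enough to sketch directly. The snippet below assumes a mean Earth radius of 6,371 km (the exact radius used by this calculator is not stated) and the airport coordinates listed further down the page.

```python
# Great-circle (haversine) distance on a spherical Earth.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Return the great-circle distance between two points, in kilometers."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * radius_km * asin(sqrt(a))

km = haversine_km(61.108056, 72.65, 45.292778, 131.192778)
print(f"{km:.1f} km / {km / 1.609344:.1f} mi / {km / 1.852:.1f} NM")
# Roughly 4104 km / 2550 mi / 2216 NM, in line with the figures above
```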

How long does it take to fly from Nefteyugansk to Jixi?

The estimated flight time from Nefteyugansk Airport to Jixi Xingkaihu Airport is 5 hours and 20 minutes.
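
The calculator does not state how this figure is derived. One common approach, sketched below, is to divide the great-circle distance by an assumed average block speed; a speed of roughly 480 mph reproduces the quoted time, but that value is an assumption, not the site's published method.

```python
# Rough flight-time estimate: distance divided by an assumed average block speed.
# The 480 mph figure is an illustrative assumption, not the calculator's method.
distance_miles = 2557
avg_speed_mph = 480

hours = distance_miles / avg_speed_mph
h, m = int(hours), round((hours - int(hours)) * 60)
print(f"about {h} h {m:02d} min flying time")   # about 5 h 20 min
```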

Flight carbon footprint between Nefteyugansk Airport (NFG) and Jixi Xingkaihu Airport (JXA)

On average, flying from Nefteyugansk to Jixi generates about 282 kg of CO2 per passenger, which is equivalent to 621 pounds (lb). These figures are estimates and include only the CO2 generated by burning jet fuel.
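
As a rough sketch of how such a per-passenger figure can be approximated, the flight distance can be multiplied by an assumed emission factor; the 0.11 kg of CO2 per passenger-mile used below is an illustrative value, not the calculator's published factor.

```python
# Rough per-passenger CO2 estimate: distance times an assumed emission factor.
# 0.11 kg CO2 per passenger-mile is an illustrative assumption.
distance_miles = 2557
kg_co2_per_passenger_mile = 0.11

co2_kg = distance_miles * kg_co2_per_passenger_mile
co2_lb = co2_kg * 2.20462          # convert kilograms to pounds
print(f"~{co2_kg:.0f} kg CO2 (~{co2_lb:.0f} lb) per passenger")
# ~281 kg (~620 lb), in line with the ~282 kg figure above
```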

Map of flight path and driving directions from Nefteyugansk to Jixi

See the map of the shortest flight path between Nefteyugansk Airport (NFG) and Jixi Xingkaihu Airport (JXA).

Airport information

Origin: Nefteyugansk Airport
City: Nefteyugansk
Country: Russia
IATA Code: NFG
ICAO Code: USRN
Coordinates: 61°6′29″N, 72°39′0″E
Destination: Jixi Xingkaihu Airport
City: Jixi
Country: China
IATA Code: JXA
ICAO Code: ZYJX
Coordinates: 45°17′34″N, 131°11′34″E