
How far is Jiagedaqi from Nefteyugansk?

The distance between Nefteyugansk (Nefteyugansk Airport) and Jiagedaqi (Jiagedaqi Airport) is 2077 miles / 3343 kilometers / 1805 nautical miles.

The driving distance from Nefteyugansk (NFG) to Jiagedaqi (JGD) is 3640 miles / 5858 kilometers, and travel time by car is about 81 hours 31 minutes.

Nefteyugansk Airport – Jiagedaqi Airport

  • 2077 miles
  • 3343 kilometers
  • 1805 nautical miles


Distance from Nefteyugansk to Jiagedaqi

There are several ways to calculate the distance from Nefteyugansk to Jiagedaqi. Here are two standard methods:

Vincenty's formula (applied above)
  • 2077.067 miles
  • 3342.716 kilometers
  • 1804.922 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
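The ellipsoidal calculation can be reproduced with a minimal sketch of Vincenty's iterative inverse method on the WGS-84 ellipsoid. The airport coordinates (from the airport information below) are converted to decimal degrees; the convergence threshold and iteration cap are implementation choices, not part of the formula itself.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Distance in kilometers between two points, computed with
    Vincenty's inverse formula on the WGS-84 ellipsoid."""
    a = 6378137.0              # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a            # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate until the longitude difference converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
              * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0  # meters -> km

# NFG (61°6′29″N, 72°39′0″E) and JGD (50°22′17″N, 124°7′3″E) in decimal degrees
km = vincenty_distance(61.10806, 72.65, 50.37139, 124.1175)
```

Running this gives a distance within a couple of kilometers of the 3342.716 km quoted above; tiny differences come from coordinate rounding.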

Haversine formula
  • 2070.599 miles
  • 3332.306 kilometers
  • 1799.301 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
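The haversine figure is much simpler to reproduce. The sketch below assumes a mean Earth radius of 6371 km; a slightly different radius would shift the result by a few kilometers.

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, r=6371.0):
    """Great-circle distance in kilometers on a sphere of radius r (km)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# Same NFG and JGD coordinates in decimal degrees
km = haversine_distance(61.10806, 72.65, 50.37139, 124.1175)
```

This lands very close to the 3332.306 km listed above, and the spherical result is about 10 km shorter than the ellipsoidal (Vincenty) one for this route.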

How long does it take to fly from Nefteyugansk to Jiagedaqi?

The estimated flight time from Nefteyugansk Airport to Jiagedaqi Airport is 4 hours and 25 minutes.
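A rough block-time model is cruise time plus a fixed taxi/climb allowance. The ~500 mph cruise speed and 30-minute overhead below are assumptions, not the site's actual model, so the result only approximates the 4 h 25 min figure above.

```python
def estimated_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough block-time estimate: cruise time plus a fixed taxi/climb
    allowance. Both parameters are assumptions, not the site's model."""
    total_min = round(distance_miles / cruise_mph * 60 + overhead_min)
    return divmod(total_min, 60)  # -> (hours, minutes)

hours, minutes = estimated_flight_time(2077)
```

With these assumptions the estimate comes out to 4 h 39 min, within about 15 minutes of the quoted time.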

Flight carbon footprint between Nefteyugansk Airport (NFG) and Jiagedaqi Airport (JGD)

On average, flying from Nefteyugansk to Jiagedaqi generates about 226 kg of CO2 per passenger, equivalent to roughly 499 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
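The per-passenger figure can be sketched as distance times an emissions factor, then converted to pounds. The ~0.1088 kg/mile factor below is an assumption back-calculated from this page's own numbers (226 kg / 2077 mi), not a published constant; converting the unrounded kilogram value before rounding explains small differences such as 498 vs 499 lb.

```python
KG_PER_LB = 0.45359237  # exact definition of the avoirdupois pound

def co2_estimate(distance_miles, kg_per_mile=0.1088):
    """Per-passenger CO2 estimate in (kg, lb). The kg-per-mile factor
    is an assumption back-calculated from this page (226 kg / 2077 mi)."""
    kg = distance_miles * kg_per_mile
    return kg, kg / KG_PER_LB

kg, lb = co2_estimate(2077)
```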

Map of flight path and driving directions from Nefteyugansk to Jiagedaqi

See the map of the shortest flight path between Nefteyugansk Airport (NFG) and Jiagedaqi Airport (JGD).

Airport information

Origin: Nefteyugansk Airport
City: Nefteyugansk
Country: Russia
IATA Code: NFG
ICAO Code: USRN
Coordinates: 61°6′29″N, 72°39′0″E
Destination: Jiagedaqi Airport
City: Jiagedaqi
Country: China
IATA Code: JGD
ICAO Code: ZYJD
Coordinates: 50°22′17″N, 124°7′3″E
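The coordinates above are given in degrees/minutes/seconds; the distance formulas need decimal degrees. A small conversion helper, using the airport coordinates listed here:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere='N'):
    """Convert degrees/minutes/seconds to decimal degrees.
    'S' and 'W' hemispheres give negative values."""
    sign = -1 if hemisphere in ('S', 'W') else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# Nefteyugansk Airport: 61°6′29″N, 72°39′0″E
nfg = (dms_to_decimal(61, 6, 29), dms_to_decimal(72, 39, 0, 'E'))
# Jiagedaqi Airport: 50°22′17″N, 124°7′3″E
jgd = (dms_to_decimal(50, 22, 17), dms_to_decimal(124, 7, 3, 'E'))
```

These decimal values (about 61.1081°N, 72.65°E and 50.3714°N, 124.1175°E) are what the distance calculations take as input.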