
How far is Jiagedaqi from Abakan?

The distance between Abakan (Abakan International Airport) and Jiagedaqi (Jiagedaqi Airport) is 1401 miles / 2255 kilometers / 1218 nautical miles.

The driving distance from Abakan (ABA) to Jiagedaqi (JGD) is 2151 miles / 3461 kilometers, and travel time by car is about 49 hours 49 minutes.


Distance from Abakan to Jiagedaqi

There are several ways to calculate the distance from Abakan to Jiagedaqi. Here are two standard methods:

Vincenty's formula (applied above)
  • 1401.488 miles
  • 2255.476 kilometers
  • 1217.859 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
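The site does not publish its code, but Vincenty's inverse method is well documented. Below is a minimal Python sketch on the WGS-84 ellipsoid; the decimal coordinates are converted from the DMS values in the airport information section, and the helper name vincenty_inverse is my own. It should reproduce the figures above to within rounding.

```python
import math

# WGS-84 ellipsoid parameters
A = 6378137.0            # semi-major axis (meters)
F = 1 / 298.257223563    # flattening
B = (1 - F) * A          # semi-minor axis (meters)

def vincenty_inverse(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Distance in meters between two points via Vincenty's inverse formula.

    May fail to converge for nearly antipodal points; fine for this route.
    """
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - F) * math.tan(phi1))  # reduced latitudes
    U2 = math.atan((1 - F) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)
        C = F / 16 * cos2_alpha * (4 + F * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * F * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (A ** 2 - B ** 2) / B ** 2
    A_ = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B_ = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B_ * sin_sigma * (
        cos_2sigma_m + B_ / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B_ / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
                                    * (-3 + 4 * cos_2sigma_m ** 2)))
    return B * A_ * (sigma - delta_sigma)

# ABA -> JGD, decimal degrees from the coordinates listed below
meters = vincenty_inverse(53.74, 91.385, 50.371389, 124.1175)
print(meters / 1000)      # ≈ 2255 km
print(meters / 1609.344)  # ≈ 1401 miles
```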

Haversine formula
  • 1397.105 miles
  • 2248.422 kilometers
  • 1214.051 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
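As a comparison, here is a minimal haversine sketch using the commonly quoted mean earth radius of 3958.8 miles (6371 km). The exact radius the site uses is not stated, so the last decimal places may differ from the figures above.

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2, radius_miles=3958.8):
    """Great-circle distance in miles, assuming a spherical earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlon = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlon / 2) ** 2)
    return 2 * radius_miles * math.asin(math.sqrt(a))

print(haversine_miles(53.74, 91.385, 50.371389, 124.1175))  # ≈ 1397 miles
```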

How long does it take to fly from Abakan to Jiagedaqi?

The estimated flight time from Abakan International Airport to Jiagedaqi Airport is 3 hours and 9 minutes.
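The page does not state its timing formula. One simple model that reproduces the 3 hours 9 minutes figure divides the great-circle distance by an assumed average block speed of about 445 mph; treat that speed as a fitted assumption, not the site's published method.

```python
# Hypothetical model: distance / average block speed.
# 445 mph is an assumption chosen to reproduce the 3 h 9 min above.
distance_miles = 1401.488
block_speed_mph = 445  # assumed average, including climb and descent

hours = distance_miles / block_speed_mph
h, m = divmod(round(hours * 60), 60)
print(f"{h} hours {m} minutes")  # -> 3 hours 9 minutes
```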

Flight carbon footprint between Abakan International Airport (ABA) and Jiagedaqi Airport (JGD)

On average, flying from Abakan to Jiagedaqi generates about 173 kg (382 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
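The page does not document its emissions model. Dividing its own numbers gives an implied factor of roughly 77 g of CO2 per passenger-kilometer; the sketch below derives that factor, which is inferred here rather than published by the site.

```python
co2_kg = 173.0            # page's per-passenger estimate
distance_km = 2255.476    # Vincenty distance from above

factor = co2_kg / distance_km
print(f"{factor * 1000:.0f} g CO2 per passenger-km")  # ≈ 77 g
print(f"{co2_kg * 2.20462:.0f} lbs")  # ≈ 381 lbs (page rounds to 382)
```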

Map of flight path and driving directions from Abakan to Jiagedaqi

See the map of the shortest flight path between Abakan International Airport (ABA) and Jiagedaqi Airport (JGD).

Airport information

Origin: Abakan International Airport
City: Abakan
Country: Russia
IATA Code: ABA
ICAO Code: UNAA
Coordinates: 53°44′24″N, 91°23′6″E
Destination: Jiagedaqi Airport
City: Jiagedaqi
Country: China
IATA Code: JGD
ICAO Code: ZYJD
Coordinates: 50°22′17″N, 124°7′3″E
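The coordinates above are given in degrees, minutes, and seconds. A small helper like the hypothetical dms_to_decimal below converts them to the decimal degrees used in the distance sketches earlier.

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds to signed decimal degrees."""
    value = deg + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

# Coordinates as listed above
aba = (dms_to_decimal(53, 44, 24, "N"), dms_to_decimal(91, 23, 6, "E"))
jgd = (dms_to_decimal(50, 22, 17, "N"), dms_to_decimal(124, 7, 3, "E"))
print(aba)  # ≈ (53.74, 91.385)
print(jgd)  # ≈ (50.371389, 124.1175)
```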