
How far is Jiagedaqi from Batagay-Alyta?

The distance between Batagay-Alyta (Sakkyryr Airport) and Jiagedaqi (Jiagedaqi Airport) is 1225 miles / 1971 kilometers / 1064 nautical miles.

The driving distance from Batagay-Alyta (SUK) to Jiagedaqi (JGD) is 1974 miles / 3177 kilometers, and travel time by car is about 68 hours 13 minutes.

Sakkyryr Airport – Jiagedaqi Airport

1225 miles
1971 kilometers
1064 nautical miles


Distance from Batagay-Alyta to Jiagedaqi

There are several ways to calculate the distance from Batagay-Alyta to Jiagedaqi. Here are two standard methods:

Vincenty's formula (applied above)
  • 1224.850 miles
  • 1971.205 kilometers
  • 1064.366 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
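For readers who want to reproduce the figure, here is a minimal Python sketch of the inverse Vincenty iteration on the WGS-84 ellipsoid, using the airport coordinates listed at the bottom of this page converted to decimal degrees. The exact ellipsoid parameters and rounding used by this calculator aren't published, so the last decimal places may differ.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Inverse Vincenty formula on the WGS-84 ellipsoid; returns meters."""
    a = 6378137.0               # semi-major axis (m)
    f = 1 / 298.257223563       # flattening
    b = (1 - f) * a             # semi-minor axis (m)

    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)

    U1 = math.atan((1 - f) * math.tan(phi1))   # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0                          # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # cos(2 * sigma_m); zero on equatorial lines where cos2_alpha == 0
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma)        # distance in meters

# SUK (67°47′31″N, 130°23′38″E) to JGD (50°22′17″N, 124°7′3″E)
m = vincenty_distance(67.791944, 130.393889, 50.371389, 124.117500)
print(f"{m / 1000:.3f} km, {m / 1609.344:.3f} mi")   # ≈ 1971 km / 1225 mi
```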

Haversine formula
  • 1222.612 miles
  • 1967.604 kilometers
  • 1062.421 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
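The haversine version is much shorter. Below is a sketch assuming a mean Earth radius of 6371 km; the radius used by this calculator isn't stated, so the final digits may differ slightly.

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given mean radius; returns km."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(h))

km = haversine_distance(67.791944, 130.393889, 50.371389, 124.117500)
print(f"{km:.1f} km")   # ≈ 1967.5 km, close to the figure above
```

The two results differ by about 3.6 km (roughly 0.18%), which is the typical scale of the discrepancy between a spherical and an ellipsoidal Earth model.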

How long does it take to fly from Batagay-Alyta to Jiagedaqi?

The estimated flight time from Sakkyryr Airport to Jiagedaqi Airport is 2 hours and 49 minutes.
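The page does not state how this estimate is derived. A common rough model, sketched below with assumed values (the cruise speed and overhead are not the calculator's published method), divides the distance by an average cruise speed and adds a fixed allowance for taxi, climb, and descent; it lands within a few minutes of the 2 hours 49 minutes quoted above.

```python
# Hypothetical model -- cruise speed and overhead are assumptions,
# not the calculator's published method.
distance_mi = 1225        # great-circle distance from above
cruise_mph = 500          # assumed average cruise speed
overhead_min = 30         # assumed taxi/climb/descent allowance
total_min = distance_mi / cruise_mph * 60 + overhead_min
print(f"about {int(total_min // 60)} h {int(total_min % 60)} min")  # ≈ 2 h 57 min
```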

Flight carbon footprint between Sakkyryr Airport (SUK) and Jiagedaqi Airport (JGD)

On average, flying from Batagay-Alyta to Jiagedaqi generates about 162 kg of CO2 per passenger, which is equivalent to roughly 358 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
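As a quick check on the arithmetic (conversion factor only; the page's 358 lb figure was presumably rounded from an unrounded kilogram value):

```python
co2_kg = 162                                          # estimate quoted above
print(f"{co2_kg * 2.20462:.0f} lb")                   # ≈ 357 lb
print(f"{co2_kg / 1225:.3f} kg per passenger-mile")   # ≈ 0.132
```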

Map of flight path and driving directions from Batagay-Alyta to Jiagedaqi

See the map of the shortest flight path between Sakkyryr Airport (SUK) and Jiagedaqi Airport (JGD).

Airport information

Origin: Sakkyryr Airport
City: Batagay-Alyta
Country: Russia
IATA Code: SUK
ICAO Code: UEBS
Coordinates: 67°47′31″N, 130°23′38″E
Destination: Jiagedaqi Airport
City: Jiagedaqi
Country: China
IATA Code: JGD
ICAO Code: ZYJD
Coordinates: 50°22′17″N, 124°7′3″E
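The coordinates above are given in degrees, minutes, and seconds. A small helper like the hypothetical `dms_to_decimal` below converts them to the decimal degrees used in the distance sketches earlier on this page:

```python
def dms_to_decimal(deg, minutes, seconds, hemi):
    """Degrees/minutes/seconds plus hemisphere letter -> signed decimal degrees."""
    dd = deg + minutes / 60 + seconds / 3600
    return -dd if hemi in ("S", "W") else dd

# SUK and JGD coordinates from the tables above
print(dms_to_decimal(67, 47, 31, "N"), dms_to_decimal(130, 23, 38, "E"))  # 67.7919 130.3939
print(dms_to_decimal(50, 22, 17, "N"), dms_to_decimal(124, 7, 3, "E"))    # 50.3714 124.1175
```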