
How far is Yanji from Uray?

The distance between Uray (Uray Airport) and Yanji (Yanji Chaoyangchuan International Airport) is 2890 miles / 4651 kilometers / 2511 nautical miles.

The driving distance from Uray (URJ) to Yanji (YNJ) is 4407 miles / 7092 kilometers, and travel time by car is about 86 hours 33 minutes.

Uray Airport – Yanji Chaoyangchuan International Airport

2890 miles / 4651 kilometers / 2511 nautical miles


Distance from Uray to Yanji

There are several ways to calculate the distance from Uray to Yanji. Here are two standard methods:

Vincenty's formula (applied above)
  • 2890.002 miles
  • 4651.007 kilometers
  • 2511.343 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
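As a quick sketch of the ellipsoidal approach, the figure above can be reproduced with the geographiclib Python package, which implements Karney's algorithm, a modern refinement of Vincenty's method on the WGS-84 ellipsoid (the library choice is an assumption for illustration, not the site's own code; coordinates are the airport coordinates listed below, converted to decimal degrees):

```python
# Sketch: ellipsoidal distance via geographiclib (pip install geographiclib).
# Karney's algorithm on the WGS-84 ellipsoid agrees with the Vincenty
# figure above to within a few meters.
from geographiclib.geodesic import Geodesic

URJ = (60.10306, 64.82667)    # Uray Airport, 60°6′11″N 64°49′36″E
YNJ = (42.88278, 129.45083)   # Yanji Chaoyangchuan Intl, 42°52′58″N 129°27′3″E

meters = Geodesic.WGS84.Inverse(URJ[0], URJ[1], YNJ[0], YNJ[1])["s12"]
km = meters / 1000.0
print(f"{km:.3f} km / {km / 1.609344:.3f} mi / {km / 1.852:.3f} nmi")
# -> roughly 4651 km / 2890 mi / 2511 nmi
```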

Haversine formula
  • 2882.237 miles
  • 4638.511 kilometers
  • 2504.596 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth, giving the great-circle distance: the shortest distance between two points along the surface of the sphere.
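The haversine calculation is simple enough to write out in full. The following self-contained sketch uses a mean earth radius of 6371 km and the decimal-degree coordinates from the airport information section below:

```python
# Self-contained haversine sketch (spherical earth, mean radius 6371 km).
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two points given in decimal degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

km = haversine_km(60.10306, 64.82667, 42.88278, 129.45083)
print(f"{km:.3f} km / {km / 1.609344:.3f} mi / {km / 1.852:.3f} nmi")
# -> about 4638.5 km / 2882.2 mi / 2504.6 nmi, matching the figures above
```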

How long does it take to fly from Uray to Yanji?

The estimated flight time from Uray Airport to Yanji Chaoyangchuan International Airport is 5 hours and 58 minutes.
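The page does not publish its flight-time formula. One plausible reconstruction that lands on 5 hours 58 minutes simply divides the Vincenty distance by an assumed average speed of about 780 km/h, a typical jet cruise speed (both the formula and the speed are assumptions for illustration):

```python
# Hypothetical reconstruction: distance / assumed average speed.
# The 780 km/h figure is an assumption chosen because it reproduces
# the 5 h 58 min estimate above; the site does not state its method.
distance_km = 4651.0
cruise_kmh = 780.0                # assumed average speed

hours = distance_km / cruise_kmh
h, m = int(hours), round((hours - int(hours)) * 60)
print(f"{h} hours {m} minutes")   # -> 5 hours 58 minutes
```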

Flight carbon footprint between Uray Airport (URJ) and Yanji Chaoyangchuan International Airport (YNJ)

On average, flying from Uray to Yanji generates about 321 kg of CO2 per passenger; 321 kilograms is equal to 708 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
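The pound figure follows directly from the standard conversion factor of about 2.20462 lb per kg:

```python
# Unit check for the footprint figure: kilograms to pounds.
co2_kg = 321
co2_lb = co2_kg * 2.20462   # 1 kg ≈ 2.20462 lb
print(round(co2_lb))        # -> 708
```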

Map of flight path and driving directions from Uray to Yanji

See the map of the shortest flight path between Uray Airport (URJ) and Yanji Chaoyangchuan International Airport (YNJ).

Airport information

Origin: Uray Airport
City: Uray
Country: Russia
IATA Code: URJ
ICAO Code: USHU
Coordinates: 60°6′11″N, 64°49′36″E
Destination: Yanji Chaoyangchuan International Airport
City: Yanji
Country: China
IATA Code: YNJ
ICAO Code: ZYYJ
Coordinates: 42°52′58″N, 129°27′3″E
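The coordinates above are given in degrees, minutes, and seconds; the distance sketches earlier on this page use their decimal-degree equivalents. The conversion is straightforward:

```python
# Converting the DMS coordinates above into the decimal degrees
# used in the distance sketches earlier on this page.
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

print(dms_to_decimal(60, 6, 11, "N"))    # URJ latitude  -> ≈ 60.10306
print(dms_to_decimal(64, 49, 36, "E"))   # URJ longitude -> ≈ 64.82667
print(dms_to_decimal(42, 52, 58, "N"))   # YNJ latitude  -> ≈ 42.88278
print(dms_to_decimal(129, 27, 3, "E"))   # YNJ longitude -> ≈ 129.45083
```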