
How far is Yanji from Mirny?

The distance between Mirny (Mirny Airport) and Yanji (Yanji Chaoyangchuan International Airport) is 1496 miles / 2408 kilometers / 1300 nautical miles.

The driving distance from Mirny (MJZ) to Yanji (YNJ) is 2108 miles / 3393 kilometers, and travel time by car is about 63 hours 13 minutes.

Mirny Airport – Yanji Chaoyangchuan International Airport


Distance from Mirny to Yanji

There are several ways to calculate the distance from Mirny to Yanji. Here are two standard methods:

Vincenty's formula (applied above)
  • 1496.063 miles
  • 2407.680 kilometers
  • 1300.043 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
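As an illustration, Vincenty's inverse method can be sketched in Python using the standard WGS-84 ellipsoid constants. This is a simplified sketch of the published algorithm, not necessarily the exact implementation behind the figure above; the airport coordinates are those listed at the bottom of the page, rounded to four decimal places.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2):
    """Iterative Vincenty inverse solution on the WGS-84 ellipsoid."""
    a = 6378137.0                    # semi-major axis (meters)
    f = 1 / 298.257223563            # flattening
    b = (1 - f) * a                  # semi-minor axis (meters)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):             # iterate until the longitude difference converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (2 * cos_2sigma_m ** 2 - 1)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sigma_m ** 2)))
    meters = b * A * (sigma - delta_sigma)
    return meters / 1609.344         # meters -> statute miles

# MJZ (62°32′4″N, 114°2′20″E) to YNJ (42°52′58″N, 129°27′3″E)
print(round(vincenty_miles(62.5344, 114.0389, 42.8828, 129.4508), 1))
```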

Haversine formula
  • 1494.310 miles
  • 2404.859 kilometers
  • 1298.520 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
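The haversine figure can be reproduced in a few lines of Python, using the common mean Earth radius of 6,371 km (a convention; other radii shift the result slightly):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere via the haversine formula."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# MJZ to YNJ, coordinates as listed below in decimal degrees
print(round(haversine_km(62.5344, 114.0389, 42.8828, 129.4508), 1))
```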

How long does it take to fly from Mirny to Yanji?

The estimated flight time from Mirny Airport to Yanji Chaoyangchuan International Airport is 3 hours and 19 minutes.
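Estimates like this typically combine the flight distance with an assumed average speed plus a fixed allowance for taxi, climb, and descent. The cruise speed and allowance below are illustrative assumptions, not the site's actual model, so the result only approximates the figure above:

```python
def estimated_flight_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough airliner flight time: cruise leg plus a fixed allowance for
    taxi, takeoff, climb, and descent. Both parameters are assumptions."""
    return overhead_min + distance_miles / cruise_mph * 60

minutes = estimated_flight_minutes(1496)
print(f"{int(minutes // 60)} h {int(minutes % 60)} min")
```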

Flight carbon footprint between Mirny Airport (MJZ) and Yanji Chaoyangchuan International Airport (YNJ)

On average, flying from Mirny to Yanji generates about 179 kg of CO2 per passenger, which is equivalent to about 395 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
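The unit conversion is a one-liner; the sketch below uses the 179 kg figure from this page and the standard conversion factor of 2.20462 lbs per kg, and also derives a rough per-mile intensity for this route:

```python
CO2_KG = 179                  # estimated CO2 per passenger for this route (from above)
KG_TO_LBS = 2.20462           # pounds per kilogram

co2_lbs = CO2_KG * KG_TO_LBS
print(round(co2_lbs))         # pounds per passenger

kg_per_mile = CO2_KG / 1496   # rough CO2 intensity over the 1496-mile flight distance
print(round(kg_per_mile, 2))
```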

Map of flight path and driving directions from Mirny to Yanji

See the map of the shortest flight path between Mirny Airport (MJZ) and Yanji Chaoyangchuan International Airport (YNJ).

Airport information

Origin Mirny Airport
City: Mirny
Country: Russia
IATA Code: MJZ
ICAO Code: UERR
Coordinates: 62°32′4″N, 114°2′20″E
Destination Yanji Chaoyangchuan International Airport
City: Yanji
Country: China
IATA Code: YNJ
ICAO Code: ZYYJ
Coordinates: 42°52′58″N, 129°27′3″E
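The coordinates above are in degrees/minutes/seconds, while the distance formulas take decimal degrees; a small conversion helper:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter (N/S/E/W)
    to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# Mirny Airport: 62°32′4″N, 114°2′20″E
print(round(dms_to_decimal(62, 32, 4, "N"), 4),
      round(dms_to_decimal(114, 2, 20, "E"), 4))
```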