How far is Yanji from Qiemo?

The distance between Qiemo (Qiemo Yudu Airport) and Yanji (Yanji Chaoyangchuan International Airport) is 2309 miles / 3716 kilometers / 2007 nautical miles.

The driving distance from Qiemo (IQM) to Yanji (YNJ) is 2746 miles / 4420 kilometers, and travel time by car is about 52 hours 29 minutes.

Qiemo Yudu Airport – Yanji Chaoyangchuan International Airport

2309 miles / 3716 kilometers / 2007 nautical miles

Distance from Qiemo to Yanji

There are several ways to calculate the distance from Qiemo to Yanji. Here are two standard methods:

Vincenty's formula (applied above)
  • 2309.154 miles
  • 3716.223 kilometers
  • 2006.600 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
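For readers who want to reproduce the ellipsoidal figure, the short Python sketch below uses pyproj's WGS84 geodesic solver as a stand-in for a hand-written Vincenty iteration; the library choice and decimal coordinates are our assumptions, but the result should land very close to the Vincenty numbers above.

```python
from pyproj import Geod  # assumes pyproj is installed

# WGS84 is the ellipsoid Vincenty's formula is normally evaluated on
geod = Geod(ellps="WGS84")

# Geod.inv takes lon/lat order and returns forward azimuth, back azimuth, distance in meters
lon1, lat1 = 85.532778, 38.149167    # Qiemo Yudu (IQM), 38°8'57"N 85°31'58"E
lon2, lat2 = 129.450833, 42.882778   # Yanji Chaoyangchuan (YNJ), 42°52'58"N 129°27'3"E
_, _, meters = geod.inv(lon1, lat1, lon2, lat2)

print(f"{meters / 1000:.3f} km / {meters / 1609.344:.3f} mi / {meters / 1852:.3f} NM")
```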

Haversine formula
  • 2303.499 miles
  • 3707.122 kilometers
  • 2001.686 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
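The haversine calculation is easy to reproduce directly. The sketch below is a minimal Python version using the airport coordinates listed further down; the mean Earth radius of 6,371 km is an assumed constant, so the output may differ slightly from the figures above.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two points on a spherical Earth."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * radius_km * asin(sqrt(a))

# Qiemo Yudu (IQM) and Yanji Chaoyangchuan (YNJ), decimal degrees
iqm = (38.149167, 85.532778)   # 38°8'57"N, 85°31'58"E
ynj = (42.882778, 129.450833)  # 42°52'58"N, 129°27'3"E

km = haversine_km(*iqm, *ynj)
print(f"{km:.1f} km / {km / 1.609344:.1f} mi / {km / 1.852:.1f} NM")
```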

How long does it take to fly from Qiemo to Yanji?

The estimated flight time from Qiemo Yudu Airport to Yanji Chaoyangchuan International Airport is 4 hours and 52 minutes.
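Estimates like this are typically derived from distance alone rather than from real schedules. The sketch below assumes an average block speed of about 850 km/h plus roughly 30 minutes for taxi, climb, and descent; these parameters are our assumptions, not the calculator's published method, but they land on a similar figure.

```python
def estimate_flight_time(distance_km, cruise_kmh=850.0, overhead_min=30.0):
    """Rough block-time estimate: cruise time plus a fixed taxi/climb/descent buffer."""
    minutes = overhead_min + distance_km / cruise_kmh * 60.0
    hours, mins = divmod(round(minutes), 60)
    return f"{hours} h {mins:02d} min"

print(estimate_flight_time(3716))  # about 4 h 52 min with these assumed parameters
```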

Flight carbon footprint between Qiemo Yudu Airport (IQM) and Yanji Chaoyangchuan International Airport (YNJ)

On average, flying from Qiemo to Yanji generates about 253 kg of CO2 per passenger, which is equivalent to roughly 558 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
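Working backwards from these numbers gives an implied factor of roughly 0.068 kg of CO2 per passenger-kilometre (253 kg ÷ 3,716 km). The sketch below treats that factor as an assumption derived from the figures above, not a published methodology, and shows the kilogram-to-pound conversion:

```python
KG_PER_KM = 253 / 3716   # ~0.068 kg CO2 per passenger-km, back-calculated from the page
KG_TO_LBS = 2.20462

def co2_estimate_kg(distance_km, factor=KG_PER_KM):
    """Per-passenger CO2 estimate for jet fuel burned over the given distance."""
    return distance_km * factor

kg = co2_estimate_kg(3716)
print(f"{kg:.0f} kg CO2 ≈ {kg * KG_TO_LBS:.0f} lbs")  # 253 kg ≈ 558 lbs
```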

Map of flight path and driving directions from Qiemo to Yanji

See the map of the shortest flight path between Qiemo Yudu Airport (IQM) and Yanji Chaoyangchuan International Airport (YNJ).

Airport information

Origin Qiemo Yudu Airport
City: Qiemo
Country: China
IATA Code: IQM
ICAO Code: ZWCM
Coordinates: 38°8′57″N, 85°31′58″E
Destination Yanji Chaoyangchuan International Airport
City: Yanji
Country: China
IATA Code: YNJ
ICAO Code: ZYYJ
Coordinates: 42°52′58″N, 129°27′3″E
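The coordinates are given in degrees, minutes, and seconds; converting them to the decimal degrees used by the distance formulas above is a simple calculation, sketched here:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere letter to signed decimal degrees."""
    sign = -1.0 if hemisphere in ("S", "W") else 1.0
    return sign * (degrees + minutes / 60.0 + seconds / 3600.0)

# Qiemo Yudu (IQM): 38°8'57"N, 85°31'58"E
print(dms_to_decimal(38, 8, 57, "N"), dms_to_decimal(85, 31, 58, "E"))
# Yanji Chaoyangchuan (YNJ): 42°52'58"N, 129°27'3"E
print(dms_to_decimal(42, 52, 58, "N"), dms_to_decimal(129, 27, 3, "E"))
```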