How far is Semey from Ejin Banner?

The distance between Ejin Banner (Ejin Banner Taolai Airport) and Semey (Semey Airport) is 1145 miles / 1843 kilometers / 995 nautical miles.

The driving distance from Ejin Banner (EJN) to Semey (PLX) is 1516 miles / 2439 kilometers, and travel time by car is about 28 hours 10 minutes.

Ejin Banner Taolai Airport – Semey Airport

1145 miles / 1843 kilometers / 995 nautical miles

Distance from Ejin Banner to Semey

There are several ways to calculate the distance from Ejin Banner to Semey. Here are two standard methods:

Vincenty's formula (applied above)
  • 1144.910 miles
  • 1842.554 kilometers
  • 994.899 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
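
As a rough illustration, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The decimal coordinates are converted from the airport coordinates listed below; the iteration cap and convergence tolerance are arbitrary choices for the sketch, not the site's actual implementation.

    import math

    def vincenty_miles(lat1, lon1, lat2, lon2):
        # WGS-84 ellipsoid parameters
        a = 6378137.0          # semi-major axis in metres
        f = 1 / 298.257223563  # flattening
        b = (1 - f) * a        # semi-minor axis

        U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
        U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
        L = math.radians(lon2 - lon1)
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)

        lam = L
        for _ in range(200):  # iterate lambda until convergence
            sinLam, cosLam = math.sin(lam), math.cos(lam)
            sin_sigma = math.hypot(cosU2 * sinLam,
                                   cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
            if sin_sigma == 0:
                return 0.0  # coincident points
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sinLam / sin_sigma
            cos2_alpha = 1 - sin_alpha ** 2
            cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
            C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
            lam_prev = lam
            lam = L + (1 - C) * f * sin_alpha * (sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
            if abs(lam - lam_prev) < 1e-12:
                break

        u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sm ** 2) -
            B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
        metres = b * A * (sigma - d_sigma)
        return metres / 1609.344  # metres to statute miles

    # EJN (42°0′55″N, 101°0′1″E) to PLX (50°21′4″N, 80°14′3″E)
    print(vincenty_miles(42.015278, 101.000278, 50.351111, 80.234167))
    # should print roughly 1144.9, matching the Vincenty figure above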

Haversine formula
  • 1142.575 miles
  • 1838.796 kilometers
  • 992.871 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
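
A minimal sketch of the haversine calculation, assuming a mean Earth radius of 3958.8 miles (6371 km); the exact radius chosen shifts the result slightly.

    import math

    def haversine_miles(lat1, lon1, lat2, lon2, radius_mi=3958.8):
        # great-circle distance on a sphere of the given radius
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        h = (math.sin(dphi / 2) ** 2 +
             math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
        return 2 * radius_mi * math.asin(math.sqrt(h))

    print(haversine_miles(42.015278, 101.000278, 50.351111, 80.234167))
    # should print roughly 1142.6, matching the haversine figure above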

How long does it take to fly from Ejin Banner to Semey?

The estimated flight time from Ejin Banner Taolai Airport to Semey Airport is 2 hours and 40 minutes.
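
Estimates like this are usually derived from the flight distance using a fixed cruise speed plus an allowance for take-off and landing. A sketch under assumed parameters (500 mph cruise, 30 minutes overhead; these are illustrative guesses, not the site's published method):

    def estimate_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
        # assumed: constant cruise speed plus a fixed take-off/landing allowance
        total_min = overhead_min + distance_miles / cruise_mph * 60
        hours, minutes = divmod(round(total_min), 60)
        return f"{hours} hours {minutes} minutes"

    print(estimate_flight_time(1144.910))
    # "2 hours 47 minutes" with these assumptions; the 2 hours 40 minutes
    # figure above evidently rests on slightly different parameters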

Flight carbon footprint between Ejin Banner Taolai Airport (EJN) and Semey Airport (PLX)

On average, flying from Ejin Banner to Semey generates about 159 kg of CO2 per passenger, equivalent to 351 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
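
The per-passenger figure is consistent with a flat per-mile emission factor. A hypothetical sketch whose factor is simply back-calculated from this page's own numbers (about 0.139 kg CO2 per passenger-mile), not an official methodology:

    KG_CO2_PER_PASSENGER_MILE = 159 / 1144.910  # back-calculated, ~0.139
    LBS_PER_KG = 2.20462

    def co2_per_passenger(distance_miles):
        # rough per-passenger CO2 from jet-fuel burn only
        kg = distance_miles * KG_CO2_PER_PASSENGER_MILE
        return kg, kg * LBS_PER_KG

    kg, lbs = co2_per_passenger(1144.910)
    print(f"{kg:.0f} kg = {lbs:.0f} lbs")  # 159 kg = 351 lbs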

Map of flight path and driving directions from Ejin Banner to Semey

See the map of the shortest flight path between Ejin Banner Taolai Airport (EJN) and Semey Airport (PLX).

Airport information

Origin: Ejin Banner Taolai Airport
City: Ejin Banner
Country: China
IATA Code: EJN
ICAO Code: ZBEN
Coordinates: 42°0′55″N, 101°0′1″E
Destination: Semey Airport
City: Semey
Country: Kazakhstan
IATA Code: PLX
ICAO Code: UASS
Coordinates: 50°21′4″N, 80°14′3″E
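
The sketches above use decimal degrees; a small helper for converting the degree-minute-second coordinates listed here (sign handling limited to the N/S/E/W suffixes used on this page):

    def dms_to_decimal(degrees, minutes, seconds, hemisphere):
        # convert degrees/minutes/seconds plus a hemisphere letter to decimal degrees
        value = degrees + minutes / 60 + seconds / 3600
        return -value if hemisphere in ("S", "W") else value

    print(dms_to_decimal(42, 0, 55, "N"), dms_to_decimal(101, 0, 1, "E"))  # EJN
    print(dms_to_decimal(50, 21, 4, "N"), dms_to_decimal(80, 14, 3, "E"))  # PLX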