How far is Jixi from Ejin Banner?

The distance between Ejin Banner (Ejin Banner Taolai Airport) and Jixi (Jixi Xingkaihu Airport) is 1521 miles / 2448 kilometers / 1322 nautical miles.

The driving distance from Ejin Banner (EJN) to Jixi (JXA) is 1887 miles / 3037 kilometers, and travel time by car is about 34 hours 34 minutes.

Ejin Banner Taolai Airport – Jixi Xingkaihu Airport

1521 miles / 2448 kilometers / 1322 nautical miles

Distance from Ejin Banner to Jixi

There are several ways to calculate the distance from Ejin Banner to Jixi. Here are two standard methods:

Vincenty's formula (applied above)
  • 1521.135 miles
  • 2448.029 kilometers
  • 1321.830 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
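
For readers who want to reproduce the figure, here is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid. It follows the standard published iteration; the function name and the convergence tolerance are choices made here for illustration. The airport coordinates listed at the bottom of this page are plugged into this function in the usage sketch there.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Ellipsoidal distance in metres between two points (degrees), WGS-84."""
    a = 6378137.0            # semi-major axis (m)
    f = 1 / 298.257223563    # flattening
    b = (1 - f) * a          # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate lambda until convergence
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.hypot(cosU2 * sinLam,
                              cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        if sinSigma == 0:
            return 0.0  # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        # cos(2*sigma_m); zero for equatorial lines where cos2Alpha == 0
        cos2SigmaM = (cosSigma - 2 * sinU1 * sinU2 / cos2Alpha
                      if cos2Alpha else 0.0)
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (
                cos2SigmaM + C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < 1e-12:
            break

    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    deltaSigma = B * sinSigma * (
        cos2SigmaM + B / 4 * (
            cosSigma * (-1 + 2 * cos2SigmaM ** 2)
            - B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2)
              * (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - deltaSigma)  # distance in metres
```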

Haversine formula
  • 1517.130 miles
  • 2441.583 kilometers
  • 1318.350 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
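
The haversine calculation takes only a few lines of Python. The 6371 km mean Earth radius used below is the conventional choice; any fixed radius gives a sphere-based answer that differs slightly from the ellipsoidal figure above.

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in kilometers on a sphere of the given radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h))
```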

How long does it take to fly from Ejin Banner to Jixi?

The estimated flight time from Ejin Banner Taolai Airport to Jixi Xingkaihu Airport is 3 hours and 22 minutes.
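
The calculator does not publish its timing model. A common rule of thumb is a fixed allowance for taxi, climb, and descent plus cruise time over the great-circle distance; the 500 mph cruise speed and 30-minute allowance in this sketch are illustrative assumptions, not the site's actual parameters, which is why the result is close to but not exactly the 3 hours 22 minutes quoted above.

```python
def estimate_flight_time(distance_miles, cruise_mph=500.0, overhead_min=30.0):
    # Fixed taxi/climb/descent allowance plus time at cruise speed.
    # Both parameters are assumptions made here for illustration.
    total_min = overhead_min + distance_miles / cruise_mph * 60.0
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} hours {minutes} minutes"

print(estimate_flight_time(1521))  # -> "3 hours 33 minutes" with these assumptions
```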

What is the time difference between Ejin Banner and Jixi?

There is no time difference between Ejin Banner and Jixi; both cities observe China Standard Time (UTC+8).

Flight carbon footprint between Ejin Banner Taolai Airport (EJN) and Jixi Xingkaihu Airport (JXA)

On average, flying from Ejin Banner to Jixi generates about 181 kg of CO2 per passenger, which is equivalent to 399 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
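
The 181 kg figure comes from the calculator itself; the pound equivalent is just a unit conversion, as the quick check below shows.

```python
CO2_KG = 181           # per-passenger estimate quoted above
LBS_PER_KG = 2.20462   # standard kilogram-to-pound conversion factor
print(f"{CO2_KG} kg = {CO2_KG * LBS_PER_KG:.0f} lbs")  # -> 399 lbs
```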

Map of flight path and driving directions from Ejin Banner to Jixi

See the map of the shortest flight path between Ejin Banner Taolai Airport (EJN) and Jixi Xingkaihu Airport (JXA).

Airport information

Origin: Ejin Banner Taolai Airport
City: Ejin Banner
Country: China
IATA Code: EJN
ICAO Code: ZBEN
Coordinates: 42°0′55″N, 101°0′1″E
Destination: Jixi Xingkaihu Airport
City: Jixi
Country: China
IATA Code: JXA
ICAO Code: ZYJX
Coordinates: 45°17′34″N, 131°11′34″E
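
To tie the airport data back to the formulas earlier on the page, here is a short usage sketch. It assumes the vincenty_distance and haversine_distance functions defined above; dms_to_decimal is a small helper written here to convert the degree/minute/second coordinates into the decimal degrees those functions expect.

```python
def dms_to_decimal(deg, minutes, seconds):
    # Both airports lie north and east, so no sign handling is needed here.
    return deg + minutes / 60 + seconds / 3600

# Coordinates from the airport information above.
ejn = (dms_to_decimal(42, 0, 55), dms_to_decimal(101, 0, 1))     # EJN
jxa = (dms_to_decimal(45, 17, 34), dms_to_decimal(131, 11, 34))  # JXA

km = vincenty_distance(*ejn, *jxa) / 1000.0  # metres -> kilometers
print(f"Vincenty:  {km:.0f} km = {km * 0.621371:.0f} mi = {km * 0.539957:.0f} nm")
print(f"Haversine: {haversine_distance(*ejn, *jxa):.0f} km")
```

Run as written, this reproduces the figures quoted above: roughly 2448 km (1521 miles / 1322 nautical miles) by Vincenty's formula and roughly 2442 km by the haversine formula.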