
How far is Jixi from Lianyungang?

The distance between Lianyungang (Lianyungang Baitabu Airport) and Jixi (Jixi Xingkaihu Airport) is 973 miles / 1567 kilometers / 846 nautical miles.

The driving distance from Lianyungang (LYG) to Jixi (JXA) is 1336 miles / 2150 kilometers, and travel time by car is about 24 hours 2 minutes.

Lianyungang Baitabu Airport – Jixi Xingkaihu Airport

973 miles / 1567 kilometers / 846 nautical miles


Distance from Lianyungang to Jixi

There are several ways to calculate the distance from Lianyungang to Jixi. Here are two standard methods:

Vincenty's formula (applied above)
  • 973.451 miles
  • 1566.617 kilometers
  • 845.905 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
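For reference, here is a minimal Python sketch of the standard Vincenty inverse iteration on the WGS-84 ellipsoid, evaluated with the airport coordinates listed below. It is an illustrative implementation, not necessarily the exact code behind the figure above.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Ellipsoidal distance (Vincenty inverse) on WGS-84, returned in statute miles."""
    a = 6378137.0                 # WGS-84 semi-major axis (metres)
    f = 1 / 298.257223563         # WGS-84 flattening
    b = (1 - f) * a               # semi-minor axis

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):     # iterate until the auxiliary-sphere longitude converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0            # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (2 * cos_2sm ** 2 - 1)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (2 * cos_2sm ** 2 - 1) -
        B / 6 * cos_2sm * (4 * sin_sigma ** 2 - 3) * (4 * cos_2sm ** 2 - 3)))
    metres = b * A * (sigma - d_sigma)
    return metres / 1609.344      # metres -> statute miles

# LYG (34°32′59″N, 119°15′0″E) to JXA (45°17′34″N, 131°11′34″E)
print(round(vincenty_miles(34.5497, 119.25, 45.2928, 131.1928), 1))  # ≈ 973.5 miles
```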

Haversine formula
  • 973.254 miles
  • 1566.301 kilometers
  • 845.735 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
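The haversine formula is simple enough to write by hand. The Python sketch below uses the airport coordinates listed further down and assumes a mean Earth radius of 3958.8 miles; the site's exact radius isn't stated, so the result may differ slightly from the figure above.

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2, radius_miles=3958.8):
    """Great-circle distance on a sphere (haversine formula), in miles."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_miles * math.asin(math.sqrt(a))

# Coordinates from the airport information below
lyg = (34.5497, 119.25)      # 34°32′59″N, 119°15′0″E
jxa = (45.2928, 131.1928)    # 45°17′34″N, 131°11′34″E
print(round(haversine_miles(*lyg, *jxa), 1))  # ≈ 973 miles
```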

How long does it take to fly from Lianyungang to Jixi?

The estimated flight time from Lianyungang Baitabu Airport to Jixi Xingkaihu Airport is 2 hours and 20 minutes.
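The basis for this estimate isn't stated. Calculators like this typically assume an average speed of roughly 500 mph plus a fixed allowance for taxi, climb, and descent; the sketch below uses those assumed values and lands in the same ballpark as the figure above.

```python
# Rough flight-time check. The 500 mph average speed and 30-minute
# taxi/climb/descent allowance are assumptions, not the site's stated model.
distance_miles = 973
avg_speed_mph = 500
overhead_hours = 0.5

total_hours = distance_miles / avg_speed_mph + overhead_hours
h, m = int(total_hours), round((total_hours % 1) * 60)
print(f"about {h} h {m} min")  # ≈ 2 h 27 min, close to the 2 h 20 min quoted above
```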

What is the time difference between Lianyungang and Jixi?

Both cities observe China Standard Time (UTC+8), so there is no time difference between Lianyungang and Jixi.

Flight carbon footprint between Lianyungang Baitabu Airport (LYG) and Jixi Xingkaihu Airport (JXA)

On average, flying from Lianyungang to Jixi generates about 149 kg of CO2 per passenger; 149 kilograms equals 328 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
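The unit conversion can be checked directly; the per-mile intensity below is derived here for illustration and is not a figure quoted by the source.

```python
co2_kg = 149                        # per-passenger estimate quoted above
co2_lbs = co2_kg * 2.20462          # kilograms to pounds
kg_per_mile = co2_kg / 973          # illustrative per-passenger-mile intensity

print(round(co2_lbs))               # ≈ 328 lbs
print(round(kg_per_mile, 2))        # ≈ 0.15 kg of CO2 per passenger-mile
```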

Map of flight path and driving directions from Lianyungang to Jixi

See the map of the shortest flight path between Lianyungang Baitabu Airport (LYG) and Jixi Xingkaihu Airport (JXA).

Airport information

Origin: Lianyungang Baitabu Airport
City: Lianyungang
Country: China
IATA Code: LYG
ICAO Code: ZSLG
Coordinates: 34°32′59″N, 119°15′0″E
Destination: Jixi Xingkaihu Airport
City: Jixi
Country: China
IATA Code: JXA
ICAO Code: ZYJX
Coordinates: 45°17′34″N, 131°11′34″E