
How far is Lijiang from Zhanjiang?

The distance between Zhanjiang (Zhanjiang Airport) and Lijiang (Lijiang Sanyi International Airport) is 742 miles / 1193 kilometers / 644 nautical miles.

The driving distance from Zhanjiang (ZHA) to Lijiang (LJG) is 963 miles / 1550 kilometers, and travel time by car is about 17 hours 21 minutes.

Zhanjiang Airport – Lijiang Sanyi International Airport

742 miles / 1193 kilometers / 644 nautical miles


Distance from Zhanjiang to Lijiang

There are several ways to calculate the distance from Zhanjiang to Lijiang. Here are two standard methods:

Vincenty's formula (applied above)
  • 741.540 miles
  • 1193.392 kilometers
  • 644.380 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
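For readers who want to reproduce this number, the sketch below uses the geopy library's geodesic distance, which also works on an ellipsoidal (WGS-84) earth model and gives results essentially identical to Vincenty's formula. The coordinates are the airport coordinates listed in the airport information section below, and geopy is assumed to be installed.

```python
# Ellipsoidal (WGS-84) distance, comparable to Vincenty's formula.
# Assumes geopy is available: pip install geopy
from geopy.distance import geodesic

zha = (21.214167, 110.357778)   # Zhanjiang Airport (ZHA)
ljg = (26.679167, 100.245556)   # Lijiang Sanyi International Airport (LJG)

d = geodesic(zha, ljg)
print(f"{d.miles:.3f} miles / {d.km:.3f} km / {d.nautical:.3f} NM")
# Expect values very close to the figures above (~741.5 mi / ~1193.4 km)
```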

Haversine formula
  • 741.375 miles
  • 1193.127 kilometers
  • 644.237 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
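As a rough sketch, the haversine calculation can be reproduced in a few lines of Python. The mean earth radius of 6371 km is an assumption; a slightly different radius shifts the result by a fraction of a percent.

```python
from math import radians, sin, cos, asin, sqrt

def haversine(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, assuming a spherical earth."""
    R_KM = 6371.0  # assumed mean earth radius in kilometers
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
    km = 2 * R_KM * asin(sqrt(a))
    return km / 1.609344, km  # statute miles, kilometers

# Zhanjiang (ZHA) to Lijiang (LJG), using the airport coordinates listed below
miles, km = haversine(21.214167, 110.357778, 26.679167, 100.245556)
print(f"{miles:.1f} mi / {km:.1f} km")  # roughly 741 mi / 1193 km
```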

How long does it take to fly from Zhanjiang to Lijiang?

The estimated flight time from Zhanjiang Airport to Lijiang Sanyi International Airport is 1 hour and 54 minutes.
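The exact assumptions behind this figure are not published. A common back-of-the-envelope estimate, sketched below, uses an average cruise speed of about 500 mph plus a fixed allowance for taxi, climb, and descent; these parameters are assumptions, so the result only approximates the 1 hour 54 minute figure above.

```python
# Hypothetical parameters: ~500 mph average speed plus a fixed 30-minute
# allowance for taxi, climb, and descent.
CRUISE_MPH = 500
OVERHEAD_MIN = 30

distance_miles = 741.5
flight_minutes = OVERHEAD_MIN + distance_miles / CRUISE_MPH * 60
hours, minutes = divmod(round(flight_minutes), 60)
print(f"about {hours} h {minutes} min")  # roughly 2 h with these assumptions
```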

Flight carbon footprint between Zhanjiang Airport (ZHA) and Lijiang Sanyi International Airport (LJG)

On average, flying from Zhanjiang to Lijiang generates about 129 kg of CO2 per passenger, which is equivalent to 284 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
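The unit conversion, and the per-mile emission rate implied by these figures, can be checked with simple arithmetic. The per-mile rate below is derived from the numbers on this page and is not an official emission factor.

```python
LB_PER_KG = 2.20462

co2_kg = 129              # per-passenger estimate from above
distance_miles = 741.5    # Vincenty distance from above

co2_lbs = co2_kg * LB_PER_KG            # ≈ 284 lb, matching the figure above
kg_per_mile = co2_kg / distance_miles   # ≈ 0.17 kg CO2 per passenger-mile (implied)
print(f"{co2_lbs:.0f} lb, {kg_per_mile:.2f} kg per passenger-mile")
```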

Map of flight path and driving directions from Zhanjiang to Lijiang

See the map of the shortest flight path between Zhanjiang Airport (ZHA) and Lijiang Sanyi International Airport (LJG).

Airport information

Origin Zhanjiang Airport
City: Zhanjiang
Country: China
IATA Code: ZHA
ICAO Code: ZGZJ
Coordinates: 21°12′51″N, 110°21′28″E
Destination Lijiang Sanyi International Airport
City: Lijiang
Country: China
IATA Code: LJG
ICAO Code: ZPLJ
Coordinates: 26°40′45″N, 100°14′44″E
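The coordinates above are given in degrees, minutes, and seconds; converting them to the decimal degrees used by the distance formulas is straightforward, as in this sketch (the helper function is illustrative, not part of any particular library).

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds to decimal degrees; S and W are negative."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# Airport coordinates from the information above
zha = (dms_to_decimal(21, 12, 51, "N"), dms_to_decimal(110, 21, 28, "E"))
ljg = (dms_to_decimal(26, 40, 45, "N"), dms_to_decimal(100, 14, 44, "E"))
print(zha)  # approximately (21.214167, 110.357778)
print(ljg)  # approximately (26.679167, 100.245556)
```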