
How far is Zhanjiang from Osaka?

The distance between Osaka (Kansai International Airport) and Zhanjiang (Zhanjiang Airport) is 1767 miles / 2844 kilometers / 1535 nautical miles.
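As a sanity check, the three figures are consistent conversions of a single underlying distance: a statute mile is exactly 1.609344 km and a nautical mile is exactly 1.852 km. A quick sketch (using the unrounded Vincenty distance given further down the page):

```python
KM_PER_MILE = 1.609344   # statute mile, exact by definition
KM_PER_NM = 1.852        # international nautical mile, exact by definition

km = 2843.527            # Vincenty distance from this page

miles = km / KM_PER_MILE          # ≈ 1766.9 statute miles
nautical_miles = km / KM_PER_NM   # ≈ 1535.4 nautical miles

print(f"{km} km = {miles:.3f} mi = {nautical_miles:.3f} nm")
```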

The driving distance from Osaka (KIX) to Zhanjiang (ZHA) is 3063 miles / 4929 kilometers, and travel time by car is about 60 hours 21 minutes.

Kansai International Airport – Zhanjiang Airport


Distance from Osaka to Zhanjiang

There are several ways to calculate the distance from Osaka to Zhanjiang. Here are two standard methods:

Vincenty's formula (applied above)
  • 1766.886 miles
  • 2843.527 kilometers
  • 1535.382 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
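The site's exact implementation is not shown; the following is a minimal Python sketch of the standard iterative Vincenty inverse formula on the WGS-84 ellipsoid, applied to the airport coordinates listed at the bottom of the page (converted from degrees-minutes-seconds to decimal degrees):

```python
import math

def vincenty_inverse(lat1, lon1, lat2, lon2):
    """Distance in meters between two points on the WGS-84 ellipsoid."""
    a = 6378137.0                  # semi-major axis (m)
    f = 1 / 298.257223563          # flattening
    b = (1 - f) * a                # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    # Reduced latitudes
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(100):           # iterate until lambda converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
                * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# KIX: 34°25′38″N, 135°14′38″E  /  ZHA: 21°12′51″N, 110°21′28″E
d_m = vincenty_inverse(34.42722, 135.24389, 21.21417, 110.35778)
print(f"{d_m / 1000:.3f} km")   # should be close to the 2843.527 km above
```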

Haversine formula
  • 1766.079 miles
  • 2842.229 kilometers
  • 1534.681 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
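The haversine calculation is much shorter. A self-contained sketch using a mean Earth radius of 6371 km (a common choice; the site's exact radius is an assumption) and the same airport coordinates:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, r_km=6371.0):
    """Great-circle distance on a sphere of radius r_km."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r_km * math.asin(math.sqrt(a))

# KIX: 34°25′38″N, 135°14′38″E  /  ZHA: 21°12′51″N, 110°21′28″E
d_km = haversine_km(34.42722, 135.24389, 21.21417, 110.35778)
print(f"{d_km:.1f} km")   # close to the 2842.229 km above
```

The spherical result differs from the ellipsoidal one by only about 1.3 km here, which is why the simpler haversine formula is often good enough for flight-distance pages.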

How long does it take to fly from Osaka to Zhanjiang?

The estimated flight time from Kansai International Airport to Zhanjiang Airport is 3 hours and 50 minutes.
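The page does not state its assumptions; a simple model divides the great-circle distance by an assumed average block speed. Back-solving from the figures above, 1767 miles in 3 h 50 min implies an average of roughly 461 mph, so that value is used here purely for illustration:

```python
distance_miles = 1767
avg_speed_mph = 461   # hypothetical average block speed, back-derived from the page

hours = distance_miles / avg_speed_mph
h = int(hours)
m = round((hours - h) * 60)
print(f"{h} hours and {m} minutes")
```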

Flight carbon footprint between Kansai International Airport (KIX) and Zhanjiang Airport (ZHA)

On average, flying from Osaka to Zhanjiang generates about 198 kg of CO2 per passenger, which is equivalent to roughly 437 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
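The page does not publish its emissions model. Dividing its per-passenger figure by the flight distance gives an implied factor of about 0.112 kg of CO2 per passenger-mile, which the sketch below uses as an assumption; the kilogram-to-pound factor (2.20462) is standard:

```python
KG_TO_LB = 2.20462                      # pounds per kilogram

co2_kg = 198                            # per-passenger estimate from the page
distance_miles = 1767

co2_lb = co2_kg * KG_TO_LB              # ≈ 436.5 lb
kg_per_mile = co2_kg / distance_miles   # implied factor, ≈ 0.112 kg CO2/mile

print(f"{co2_kg} kg ≈ {co2_lb:.0f} lb, {kg_per_mile:.3f} kg CO2 per mile")
```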

Map of flight path and driving directions from Osaka to Zhanjiang

See the map of the shortest flight path between Kansai International Airport (KIX) and Zhanjiang Airport (ZHA).

Airport information

Origin Kansai International Airport
City: Osaka
Country: Japan
IATA Code: KIX
ICAO Code: RJBB
Coordinates: 34°25′38″N, 135°14′38″E
Destination Zhanjiang Airport
City: Zhanjiang
Country: China
IATA Code: ZHA
ICAO Code: ZGZJ
Coordinates: 21°12′51″N, 110°21′28″E