
How far is Zhanjiang from Bayannur?

The distance between Bayannur (Bayannur Tianjitai Airport) and Zhanjiang (Zhanjiang Airport) is 1367 miles / 2199 kilometers / 1188 nautical miles.

The driving distance from Bayannur (RLK) to Zhanjiang (ZHA) is 1731 miles / 2786 kilometers, and travel time by car is about 31 hours 24 minutes.

Bayannur Tianjitai Airport – Zhanjiang Airport

1367 miles · 2199 kilometers · 1188 nautical miles


Distance from Bayannur to Zhanjiang

There are several ways to calculate the distance from Bayannur to Zhanjiang. Here are two standard methods:

Vincenty's formula (applied above)
  • 1366.675 miles
  • 2199.451 kilometers
  • 1187.608 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
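Vincenty's inverse method can be sketched in Python. This is a standard textbook implementation on the WGS-84 ellipsoid, not this calculator's own code; the decimal-degree coordinates are converted from the airport coordinates listed below.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Distance in metres between two points on the WGS-84 ellipsoid
    (Vincenty's inverse formula, iterative)."""
    a = 6378137.0                 # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563         # WGS-84 flattening
    b = (1 - f) * a               # semi-minor axis
    L = math.radians(lon2 - lon1)
    # Reduced latitudes
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)
    lam = L
    for _ in range(200):          # iterate until the longitude converges
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.sqrt((cosU2 * sinLam) ** 2 +
                             (cosU1 * sinU2 - sinU1 * cosU2 * cosLam) ** 2)
        if sinSigma == 0:
            return 0.0            # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = (cosSigma - 2 * sinU1 * sinU2 / cos2Alpha
                      if cos2Alpha else 0.0)
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (cos2SigmaM + C * cosSigma *
                                    (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < 1e-12:
            break
    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    deltaSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2) -
        B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2) *
        (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - deltaSigma)

# RLK (40°55′33″N, 107°44′34″E) and ZHA (21°12′51″N, 110°21′28″E)
d_km = vincenty_distance(40.9258, 107.7428, 21.2142, 110.3578) / 1000
# roughly 2199 km, in line with the figure above
```

The iteration refines the longitude difference on the ellipsoid until it converges, which is why Vincenty is slightly more accurate (and more expensive) than a spherical formula.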

Haversine formula
  • 1370.526 miles
  • 2205.648 kilometers
  • 1190.955 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
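The haversine computation is compact enough to show in full. A minimal Python sketch, assuming a mean Earth radius of 6371 km (the exact radius this calculator uses is not stated):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, r=6371.0):
    """Great-circle distance in km between two lat/lon points,
    assuming a spherical Earth of radius r."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    # Haversine of the central angle between the two points
    h = (math.sin(dphi / 2) ** 2 +
         math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(h))

# RLK (40°55′33″N, 107°44′34″E) and ZHA (21°12′51″N, 110°21′28″E)
d_km = haversine_km(40.9258, 107.7428, 21.2142, 110.3578)
# roughly 2206 km, in line with the figure above
```

The small gap between this result and the Vincenty figure (about 6 km here) comes entirely from treating the Earth as a sphere rather than an ellipsoid.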

How long does it take to fly from Bavannur to Zhanjiang?

The estimated flight time from Bayannur Tianjitai Airport to Zhanjiang Airport is 3 hours and 5 minutes.
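Flight-time estimates of this kind are typically distance divided by an assumed cruise speed plus a fixed allowance for taxi, climb, and descent. The calculator's exact parameters are not published; the cruise speed and buffer below are hypothetical values chosen to land near the stated figure:

```python
def estimated_flight_minutes(distance_km, cruise_kmh=750, buffer_min=10):
    """Rough block-time estimate: cruise time plus a fixed allowance.
    cruise_kmh and buffer_min are assumed values, not the site's own."""
    return round(distance_km / cruise_kmh * 60) + buffer_min

est = estimated_flight_minutes(2199)   # close to the stated 3 h 5 min
```

With these assumptions the estimate comes out within a few minutes of the 3 hours 5 minutes quoted above.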

Flight carbon footprint between Bayannur Tianjitai Airport (RLK) and Zhanjiang Airport (ZHA)

On average, flying from Bayannur to Zhanjiang generates about 171 kg of CO2 per passenger, which is equivalent to about 377 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
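The kilograms-to-pounds conversion above uses the standard factor of about 2.20462 lb per kg:

```python
KG_TO_LB = 2.20462  # pounds per kilogram

def kg_to_lb(kg):
    """Convert a mass in kilograms to pounds, rounded to the nearest pound."""
    return round(kg * KG_TO_LB)

kg_to_lb(171)  # → 377
```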

Map of flight path and driving directions from Bayannur to Zhanjiang

See the map of the shortest flight path between Bayannur Tianjitai Airport (RLK) and Zhanjiang Airport (ZHA).

Airport information

Origin Bayannur Tianjitai Airport
City: Bayannur
Country: China
IATA Code: RLK
ICAO Code: ZBYZ
Coordinates: 40°55′33″N, 107°44′34″E
Destination Zhanjiang Airport
City: Zhanjiang
Country: China
IATA Code: ZHA
ICAO Code: ZGZJ
Coordinates: 21°12′51″N, 110°21′28″E