How far is Baishan from Hami?

The distance between Hami (Hami Airport) and Baishan (Changbaishan Airport) is 1723 miles / 2774 kilometers / 1498 nautical miles.

The driving distance from Hami (HMI) to Baishan (NBS) is 2064 miles / 3321 kilometers, and travel time by car is about 37 hours 32 minutes.

Distance from Hami to Baishan

There are several ways to calculate the distance from Hami to Baishan. Here are two standard methods:

Vincenty's formula (applied above)
  • 1723.465 miles
  • 2773.647 kilometers
  • 1497.650 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
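For readers who want to reproduce these numbers, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The decimal coordinates are converted from the airport coordinates listed below; the iteration cap and convergence tolerance are assumptions of this sketch, not part of the original calculation, so the result should closely (but not necessarily exactly) match the figures above.

```python
import math

def vincenty_m(lat1, lon1, lat2, lon2):
    # WGS-84 ellipsoid parameters
    a = 6378137.0                # semi-major axis (m)
    f = 1 / 298.257223563        # flattening
    b = (1 - f) * a              # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):                       # iteration cap (assumed)
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0                         # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                   if cos2_alpha else 0.0)     # equatorial geodesic
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:        # tolerance (assumed)
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
        * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - d_sigma)           # geodesic distance in metres

HMI = (42.8414, 93.6692)   # 42°50′29″N, 93°40′9″E in decimal degrees
NBS = (42.0667, 127.6019)  # 42°4′0″N, 127°36′7″E in decimal degrees

m = vincenty_m(*HMI, *NBS)
print(f"{m / 1609.344:.3f} mi / {m / 1000:.3f} km / {m / 1852:.3f} NM")
```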

Haversine formula
  • 1718.918 miles
  • 2766.330 kilometers
  • 1493.699 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
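The haversine version is much shorter. The mean Earth radius below is an assumption (the page does not state which radius it uses), so the last digits may differ slightly from the figures above.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0088):
    # Great-circle distance on a sphere (IUGG mean Earth radius assumed).
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

km = haversine_km(42.8414, 93.6692, 42.0667, 127.6019)
print(f"{km / 1.609344:.3f} mi / {km:.3f} km / {km / 1.852:.3f} NM")
```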

How long does it take to fly from Hami to Baishan?

The estimated flight time from Hami Airport to Changbaishan Airport is 3 hours and 45 minutes.
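The page does not publish the assumptions behind this estimate. A common heuristic is cruise distance divided by an average speed, plus a fixed allowance for taxi, climb, and descent; the 500 mph speed and 30-minute allowance below are illustrative assumptions, so the output only roughly matches the 3 hours 45 minutes quoted above.

```python
def block_time(distance_miles, avg_speed_mph=500, overhead_min=30):
    # Hypothetical heuristic: cruise time plus a fixed taxi/climb/descent
    # allowance. Both parameters are assumptions, not the site's method.
    total_min = round(distance_miles / avg_speed_mph * 60 + overhead_min)
    hours, minutes = divmod(total_min, 60)
    return f"{hours} hours {minutes} minutes"

print(block_time(1723.465))  # roughly in line with the estimate above
```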

What is the time difference between Hami and Baishan?

There is no time difference between Hami and Baishan; both cities observe China Standard Time (UTC+8).

Flight carbon footprint between Hami Airport (HMI) and Changbaishan Airport (NBS)

On average, flying from Hami to Baishan generates about 194 kg (428 lb) of CO2 per passenger. This figure is an estimate and includes only the CO2 produced by burning jet fuel.
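The kilogram-to-pound conversion is exact; the per-passenger emission factor below is back-calculated from this page's own numbers (194 kg over 2774 km, roughly 0.07 kg of CO2 per passenger-kilometre) and is an illustrative assumption, not a published methodology.

```python
LB_PER_KG = 2.20462262  # pounds per kilogram

def co2_per_passenger_kg(distance_km, kg_per_pkm=0.07):
    # kg_per_pkm is back-calculated from this page (194 kg / 2774 km);
    # real emission factors vary by aircraft type and load factor.
    return distance_km * kg_per_pkm

kg = co2_per_passenger_kg(2774)
print(f"{kg:.0f} kg CO2 = {kg * LB_PER_KG:.0f} lb")  # ~194 kg = ~428 lb
```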

Map of flight path and driving directions from Hami to Baishan

See the map of the shortest flight path between Hami Airport (HMI) and Changbaishan Airport (NBS).

Airport information

Origin: Hami Airport
City: Hami
Country: China
IATA Code: HMI
ICAO Code: ZWHM
Coordinates: 42°50′29″N, 93°40′9″E
Destination: Changbaishan Airport
City: Baishan
Country: China
IATA Code: NBS
ICAO Code: ZYBS
Coordinates: 42°4′0″N, 127°36′7″E
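The decimal coordinates used in the code sketches above come from the DMS values listed here; a small helper (the function name is mine) makes the conversion explicit.

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    # Convert degrees/minutes/seconds to signed decimal degrees.
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

HMI = (dms_to_decimal(42, 50, 29, "N"), dms_to_decimal(93, 40, 9, "E"))
NBS = (dms_to_decimal(42, 4, 0, "N"), dms_to_decimal(127, 36, 7, "E"))
print(HMI)  # (42.8413..., 93.6691...)
print(NBS)  # (42.0666..., 127.6019...)
```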