
How far is Baishan from Wuhai?

The distance between Wuhai (Wuhai Airport) and Baishan (Changbaishan Airport) is 1097 miles / 1766 kilometers / 953 nautical miles.

The driving distance from Wuhai (WUA) to Baishan (NBS) is 1339 miles / 2155 kilometers, and travel time by car is about 24 hours 37 minutes.

Wuhai Airport – Changbaishan Airport: 1097 miles / 1766 kilometers / 953 nautical miles


Distance from Wuhai to Baishan

There are several ways to calculate the distance from Wuhai to Baishan. Here are two standard methods:

Vincenty's formula (applied above)
  • 1097.163 miles
  • 1765.713 kilometers
  • 953.409 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
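
For readers who want to reproduce the figure, here is a sketch of the standard published inverse Vincenty iteration in Python, using the WGS-84 constants and the airport coordinates from the table at the bottom of this page. The function name is ours; this is an illustration, not the calculator's own code.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Inverse Vincenty distance on the WGS-84 ellipsoid, in statute miles."""
    a = 6378137.0              # WGS-84 semi-major axis (metres)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a            # semi-minor axis

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):   # iterate on the auxiliary longitude difference
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.hypot(cosU2 * sinLam, cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        if sinSigma == 0:
            return 0.0          # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = (cosSigma - 2 * sinU1 * sinU2 / cos2Alpha) if cos2Alpha else 0.0
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (cos2SigmaM + C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < tol:
            break

    u2 = cos2Alpha * (a * a - b * b) / (b * b)
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    dSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2)
        - B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2) * (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - dSigma) / 1609.344   # metres -> miles

# Airport coordinates from the table below, converted to decimal degrees
print(vincenty_miles(39.7933, 106.7992, 42.0667, 127.6019))  # ≈ 1097.16 miles
```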

Haversine formula
  • 1094.448 miles
  • 1761.344 kilometers
  • 951.050 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
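
The haversine version is much shorter. The sketch below assumes the commonly used mean Earth radius of 6371 km, which reproduces the figures above:

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance assuming a sphere of mean Earth radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a)) / 1.609344  # km -> miles

print(haversine_miles(39.7933, 106.7992, 42.0667, 127.6019))  # ≈ 1094.45 miles
```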

How long does it take to fly from Wuhai to Baishan?

The estimated flight time from Wuhai Airport to Changbaishan Airport is 2 hours and 34 minutes.
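
The page does not state how this figure is derived. A common rule of thumb is distance divided by an assumed average speed, plus a fixed allowance for taxi, climb, and descent. The parameters in the sketch below are illustrative assumptions, so the result is only close to, not exactly, the quoted 2 hours 34 minutes:

```python
# Illustrative assumptions only; the calculator's actual parameters are not published.
AVG_SPEED_MPH = 500   # assumed average cruise speed
BUFFER_MIN = 30       # assumed allowance for taxi, climb and descent

def est_flight_minutes(distance_miles):
    return distance_miles / AVG_SPEED_MPH * 60 + BUFFER_MIN

m = est_flight_minutes(1097)
print(f"{int(m // 60)} h {int(m % 60)} min")  # ≈ 2 h 41 min under these assumptions
```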

What is the time difference between Wuhai and Baishan?

There is no time difference between Wuhai and Baishan.
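
Both cities observe China Standard Time, since all of mainland China uses a single time zone (UTC+8, with no daylight saving). A quick check in Python:

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # Python 3.9+

# Mainland China uses one zone, Asia/Shanghai (UTC+8, no DST), so Wuhai and
# Baishan share the same local time despite ~21 degrees of longitude between them.
print(datetime.now(ZoneInfo("Asia/Shanghai")).utcoffset())  # 8:00:00
```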

Flight carbon footprint between Wuhai Airport (WUA) and Changbaishan Airport (NBS)

On average, flying from Wuhai to Baishan generates about 157 kg of CO2 per passenger; 157 kilograms equals 346 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
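
The emission model behind that number is not published here. One simple reconstruction multiplies distance by a per-passenger fuel-burn rate and the widely used factor of about 3.16 kg of CO2 per kg of jet fuel; the fuel-burn rate below is back-solved from the 157 kg figure and is purely illustrative:

```python
CO2_PER_KG_FUEL = 3.16           # widely used factor: kg CO2 per kg of jet fuel burned
FUEL_KG_PER_PAX_MILE = 0.0453    # hypothetical, back-solved from 157 kg over 1097 miles

def est_co2_kg(distance_miles):
    return distance_miles * FUEL_KG_PER_PAX_MILE * CO2_PER_KG_FUEL

kg = est_co2_kg(1097)
print(round(kg), "kg,", round(kg * 2.20462), "lbs")  # ≈ 157 kg, 346 lbs per passenger
```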

Map of flight path and driving directions from Wuhai to Baishan

See the map of the shortest flight path between Wuhai Airport (WUA) and Changbaishan Airport (NBS).

Airport information

Origin: Wuhai Airport
City: Wuhai
Country: China
IATA Code: WUA
ICAO Code: ZBUH
Coordinates: 39°47′36″N, 106°47′57″E
Destination: Changbaishan Airport
City: Baishan
Country: China
IATA Code: NBS
ICAO Code: ZYBS
Coordinates: 42°4′0″N, 127°36′7″E
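
The coordinates above are given in degrees, minutes, and seconds, while the distance formulas earlier take decimal degrees. A small helper (a sketch; the function name is ours) converts one to the other:

```python
import re

def dms_to_decimal(dms):
    """Parse a coordinate like 39°47′36″N into decimal degrees."""
    d, m, s, hemi = re.fullmatch(r"(\d+)°(\d+)′(\d+)″([NSEW])", dms).groups()
    value = int(d) + int(m) / 60 + int(s) / 3600
    return -value if hemi in "SW" else value

print(dms_to_decimal("39°47′36″N"))  # 39.7933... (Wuhai Airport latitude)
print(dms_to_decimal("127°36′7″E"))  # 127.6019... (Changbaishan Airport longitude)
```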