
How far is Shihezi from Karamay?

The distance between Karamay (Karamay Airport) and Shihezi (Shihezi Huayuan Airport) is 107 miles / 172 kilometers / 93 nautical miles.

The driving distance from Karamay (KRY) to Shihezi (SHF) is 132 miles / 213 kilometers, and travel time by car is about 2 hours 42 minutes.

Karamay Airport – Shihezi Huayuan Airport

107 miles / 172 kilometers / 93 nautical miles


Distance from Karamay to Shihezi

There are several ways to calculate the distance from Karamay to Shihezi. Here are two standard methods:

Vincenty's formula (applied above)
  • 107.032 miles
  • 172.251 kilometers
  • 93.008 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
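For an ellipsoidal calculation you can lean on the geopy library rather than coding Vincenty's iteration by hand. Note that geopy's geodesic() uses Karney's algorithm on the WGS-84 ellipsoid rather than Vincenty's original method, so the result should land very close to, but not necessarily exactly on, the figures above. This is a minimal sketch, assuming geopy is installed and using decimal-degree coordinates converted from the DMS values in the airport information section.

```python
from geopy.distance import geodesic  # pip install geopy

KRY = (45.6169, 84.8828)   # Karamay Airport, decimal degrees
SHF = (44.2419, 85.8903)   # Shihezi Huayuan Airport, decimal degrees

d = geodesic(KRY, SHF)     # WGS-84 ellipsoid by default
print(f"{d.miles:.3f} miles / {d.km:.3f} km / {d.nautical:.3f} nautical miles")
```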

Haversine formula
  • 107.017 miles
  • 172.228 kilometers
  • 92.995 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
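The haversine calculation is simple enough to write out directly. The sketch below assumes a mean earth radius of 6371 km and uses the same decimal-degree coordinates as above; it reproduces the figures quoted for this route.

```python
from math import radians, sin, cos, asin, sqrt

KRY = (45.6169, 84.8828)   # Karamay Airport (lat, lon)
SHF = (44.2419, 85.8903)   # Shihezi Huayuan Airport (lat, lon)

def haversine_km(a, b, radius_km=6371.0):
    """Great-circle distance between two (lat, lon) points on a spherical earth."""
    lat1, lon1, lat2, lon2 = map(radians, (*a, *b))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    h = sin(dlat / 2) ** 2 + cos(lat1) * cos(lat2) * sin(dlon / 2) ** 2
    return 2 * radius_km * asin(sqrt(h))

km = haversine_km(KRY, SHF)
print(f"{km / 1.609344:.3f} miles / {km:.3f} km / {km / 1.852:.3f} nautical miles")
# Roughly 107.0 miles / 172.2 km / 93.0 nautical miles, matching the figures above.
```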

How long does it take to fly from Karamay to Shihezi?

The estimated flight time from Karamay Airport to Shihezi Huayuan Airport is 42 minutes.
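The page does not state how the 42-minute figure is derived. A common rule of thumb for short flights is cruise distance at roughly 500 mph plus a fixed allowance of about 30 minutes for taxi, takeoff, and landing, which lands close to the quoted value; the cruise speed and overhead below are assumptions, not figures from the source.

```python
def estimated_flight_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    # Rule-of-thumb estimate: cruise time plus a fixed taxi/takeoff/landing allowance.
    return overhead_min + distance_miles / cruise_mph * 60

print(round(estimated_flight_minutes(107)))  # ~43 minutes, close to the quoted 42
```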

What is the time difference between Karamay and Shihezi?

There is no time difference between Karamay and Shihezi.

Flight carbon footprint between Karamay Airport (KRY) and Shihezi Huayuan Airport (SHF)

On average, flying from Karamay to Shihezi generates about 41 kg (90 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
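The source does not explain how the 41 kg figure is calculated. As an illustration only, the sketch below applies a flat per-kilometre emission factor (the 0.24 kg CO2 per passenger-km used here is chosen to reproduce the quoted number, not a published constant) and converts kilograms to pounds.

```python
DISTANCE_KM = 172          # great-circle distance from above
KG_CO2_PER_PAX_KM = 0.24   # assumed factor, chosen to match the quoted 41 kg

co2_kg = DISTANCE_KM * KG_CO2_PER_PAX_KM
print(f"{co2_kg:.0f} kg CO2 per passenger ≈ {co2_kg * 2.20462:.0f} lbs")
# ≈ 41 kg ≈ 91 lbs per passenger
```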

Map of flight path and driving directions from Karamay to Shihezi

See the map of the shortest flight path between Karamay Airport (KRY) and Shihezi Huayuan Airport (SHF).

Airport information

Origin: Karamay Airport
City: Karamay
Country: China
IATA Code: KRY
ICAO Code: ZWKM
Coordinates: 45°37′1″N, 84°52′58″E
Destination: Shihezi Huayuan Airport
City: Shihezi
Country: China
IATA Code: SHF
ICAO Code: ZWHZ
Coordinates: 44°14′31″N, 85°53′25″E