
How far is Shihezi from Luoyang?

The distance between Luoyang (Luoyang Beijiao Airport) and Shihezi (Shihezi Huayuan Airport) is 1551 miles / 2497 kilometers / 1348 nautical miles.

The driving distance from Luoyang (LYA) to Shihezi (SHF) is 1864 miles / 3000 kilometers, and travel time by car is about 33 hours 39 minutes.

Luoyang Beijiao Airport – Shihezi Huayuan Airport: 1551 miles / 2497 kilometers / 1348 nautical miles


Distance from Luoyang to Shihezi

There are several ways to calculate the distance from Luoyang to Shihezi. Here are two standard methods:

Vincenty's formula (applied above)
  • 1551.479 miles
  • 2496.864 kilometers
  • 1348.199 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
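As a sketch of how such an ellipsoidal distance is computed, here is a minimal Python implementation of Vincenty's inverse method on the WGS-84 ellipsoid. The ellipsoid constants are standard; the convergence tolerance and iteration cap are my own choices, and the decimal coordinates are converted from the DMS values in the airport information section.

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2):
    """Geodesic distance in km between two points on the WGS-84 ellipsoid."""
    a = 6378137.0                # semi-major axis (m)
    f = 1 / 298.257223563        # flattening
    b = (1 - f) * a              # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(100):  # iterate until the longitude difference converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.sqrt((cosU2 * sin_lam) ** 2 +
                              (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sm + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sm ** 2) -
            B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) *
            (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0

# LYA (34°44′27″N, 112°23′16″E) to SHF (44°14′31″N, 85°53′25″E)
d = vincenty_km(34.740833, 112.387778, 44.241944, 85.890278)
```

For these coordinates the result is close to the 2496.864 km quoted above.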

Haversine formula
  • 1548.761 miles
  • 2492.490 kilometers
  • 1345.837 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
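The haversine computation is much simpler than Vincenty's. A minimal sketch, assuming the commonly used mean Earth radius of 6371 km (the page does not state which radius it uses):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given mean radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2 +
         math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return radius_km * 2 * math.asin(math.sqrt(a))

# LYA to SHF, coordinates from the airport information section
# converted to decimal degrees.
d = haversine_km(34.740833, 112.387778, 44.241944, 85.890278)
```

This lands within about a kilometer of the 2492.490 km quoted above; the small gap from the Vincenty figure reflects the spherical approximation.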

How long does it take to fly from Luoyang to Shihezi?

The estimated flight time from Luoyang Beijiao Airport to Shihezi Huayuan Airport is 3 hours and 26 minutes.
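The quoted time is consistent with a common rule of thumb: distance divided by an assumed cruise speed, plus a fixed allowance for taxi, climb, and descent. Both parameters below (500 mph cruise, 20 minutes overhead) are assumptions that happen to reproduce the quoted figure, not values published by the calculator.

```python
def estimate_flight_time(distance_miles, cruise_mph=500, overhead_min=20):
    """Rough block-time estimate: cruise time plus a fixed overhead."""
    total_min = int(distance_miles / cruise_mph * 60 + overhead_min)
    return divmod(total_min, 60)  # (hours, minutes)

hours, minutes = estimate_flight_time(1551.479)  # (3, 26) for this route
```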

Flight carbon footprint between Luoyang Beijiao Airport (LYA) and Shihezi Huayuan Airport (SHF)

On average, flying from Luoyang to Shihezi generates about 183 kg of CO2 per passenger, equivalent to 403 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
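The unit conversion behind these numbers is straightforward; the per-kilometer emission factor below is simply back-derived from the quoted figures for this route, not an official coefficient.

```python
KG_PER_LB = 0.45359237  # exact definition of the avoirdupois pound

def kg_to_lb(kg):
    """Convert kilograms to pounds."""
    return kg / KG_PER_LB

emissions_lb = round(kg_to_lb(183))  # 183 kg is about 403 lbs

# Implied per-passenger emission factor for this route
# (derived from 183 kg over 2496.864 km, roughly 0.073 kg CO2 per km)
factor_kg_per_km = 183 / 2496.864
```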

Map of flight path and driving directions from Luoyang to Shihezi

See the map of the shortest flight path between Luoyang Beijiao Airport (LYA) and Shihezi Huayuan Airport (SHF).

Airport information

Origin: Luoyang Beijiao Airport
City: Luoyang
Country: China
IATA Code: LYA
ICAO Code: ZHLY
Coordinates: 34°44′27″N, 112°23′16″E

Destination: Shihezi Huayuan Airport
City: Shihezi
Country: China
IATA Code: SHF
ICAO Code: ZWHZ
Coordinates: 44°14′31″N, 85°53′25″E
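The coordinates above are given in degrees, minutes, and seconds; a small helper (mine, not part of the page) converts them to the decimal degrees used by the distance formulas:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert D°M′S″ plus a hemisphere letter to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# Luoyang Beijiao Airport: 34°44′27″N, 112°23′16″E
lya = (dms_to_decimal(34, 44, 27, "N"), dms_to_decimal(112, 23, 16, "E"))
# Shihezi Huayuan Airport: 44°14′31″N, 85°53′25″E
shf = (dms_to_decimal(44, 14, 31, "N"), dms_to_decimal(85, 53, 25, "E"))
```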