
How far is Meixian from Yingkou?

The distance between Yingkou (Yingkou Lanqi Airport) and Meixian (Meixian Airport) is 1173 miles / 1887 kilometers / 1019 nautical miles.

The driving distance from Yingkou (YKH) to Meixian (MXZ) is 1541 miles / 2480 kilometers, and travel time by car is about 28 hours.

Yingkou Lanqi Airport – Meixian Airport

1173 miles / 1887 kilometers / 1019 nautical miles


Distance from Yingkou to Meixian

There are several ways to calculate the distance from Yingkou to Meixian. Here are two standard methods:

Vincenty's formula (applied above)
  • 1172.752 miles
  • 1887.361 kilometers
  • 1019.093 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
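For illustration, here is a sketch of the Vincenty inverse formula on the WGS-84 ellipsoid. This is an assumed, simplified implementation (production code would normally use a geodesy library such as geographiclib), but it reproduces the kilometer figure quoted above for these two airports:

```python
import math

def vincenty_inverse(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Geodesic distance in km on the WGS-84 ellipsoid (Vincenty inverse formula)."""
    a = 6378137.0              # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a            # semi-minor axis (m)

    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(phi1))   # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0                         # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sm + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sm ** 2)
            - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0   # metres -> km
```

Called with the airport coordinates listed at the bottom of this page (40.5425°N, 122.3583°E and 24.35°N, 116.1328°E), it returns approximately 1887 km.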

Haversine formula
  • 1175.367 miles
  • 1891.571 kilometers
  • 1021.366 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
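The haversine calculation is short enough to show in full. A minimal sketch, using the conventional mean Earth radius of 6371 km (an assumption; the page does not state which radius it uses):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in km on a sphere of the given radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# YKH (40°32′33″N, 122°21′30″E) to MXZ (24°21′0″N, 116°7′58″E)
d = haversine_km(40.5425, 122.3583, 24.35, 116.1328)   # ≈ 1891.5 km
```

With these inputs the result is about 1891.5 km, matching the 1891.571 km figure above to within rounding of the coordinates.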

How long does it take to fly from Yingkou to Meixian?

The estimated flight time from Yingkou Lanqi Airport to Meixian Airport is 2 hours and 43 minutes.
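The page does not publish its estimation method. A common rule of thumb (an assumption here, not necessarily the site's formula) is a fixed overhead of about 30 minutes for taxi, climb, and descent plus the great-circle distance flown at a typical jet cruise speed of roughly 500 mph:

```python
def estimate_flight_hours(distance_miles, cruise_mph=500.0, overhead_hours=0.5):
    # Rule-of-thumb estimate: cruise speed and fixed overhead are assumptions,
    # not the site's published method.
    return overhead_hours + distance_miles / cruise_mph

hours = estimate_flight_hours(1173)   # 0.5 + 1173/500 = 2.846 hours
```

This yields about 2 hours 51 minutes, in the same ballpark as the quoted 2 hours 43 minutes.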

What is the time difference between Yingkou and Meixian?

There is no time difference between Yingkou and Meixian.

Flight carbon footprint between Yingkou Lanqi Airport (YKH) and Meixian Airport (MXZ)

On average, flying from Yingkou to Meixian generates about 160 kg of CO2 per passenger, which is equivalent to about 353 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
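The unit conversion behind the quoted figures is straightforward (the 160 kg per-passenger estimate itself comes from the page; the conversion factor is the standard one):

```python
KG_TO_LB = 2.20462          # pounds per kilogram

co2_kg = 160                # quoted per-passenger estimate for this route
co2_lb = co2_kg * KG_TO_LB  # ≈ 352.7, which rounds to the quoted 353 lbs
```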

Map of flight path and driving directions from Yingkou to Meixian

See the map of the shortest flight path between Yingkou Lanqi Airport (YKH) and Meixian Airport (MXZ).

Airport information

Origin Yingkou Lanqi Airport
City: Yingkou
Country: China
IATA Code: YKH
ICAO Code: ZYYK
Coordinates: 40°32′33″N, 122°21′30″E
Destination Meixian Airport
City: Meixian
Country: China
IATA Code: MXZ
ICAO Code: ZGMX
Coordinates: 24°21′0″N, 116°7′58″E
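The coordinates above are given in degrees/minutes/seconds; the distance formulas earlier on the page need them in decimal degrees. A small conversion helper (the function name is my own):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter (N/S/E/W)
    to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

ykh_lat = dms_to_decimal(40, 32, 33, "N")   # 40.5425
mxz_lat = dms_to_decimal(24, 21, 0, "N")    # 24.35
```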