
How far is Lijiang from Maiduguri?

The distance between Maiduguri (Maiduguri International Airport) and Lijiang (Lijiang Sanyi International Airport) is 5688 miles / 9153 kilometers / 4942 nautical miles.

Maiduguri International Airport – Lijiang Sanyi International Airport

  • 5688 miles
  • 9153 kilometers
  • 4942 nautical miles


Distance from Maiduguri to Lijiang

There are several ways to calculate the distance from Maiduguri to Lijiang. Here are two standard methods:

Vincenty's formula (applied above)
  • 5687.642 miles
  • 9153.373 kilometers
  • 4942.426 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.

Haversine formula
  • 5680.364 miles
  • 9141.659 kilometers
  • 4936.101 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
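The great-circle calculation is much simpler and can be sketched in a few lines, assuming a mean earth radius of 6371 km (the exact radius the site uses is not stated):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two points on a spherical earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return radius_km * 2 * math.atan2(math.sqrt(a), math.sqrt(1 - a))

# Decimal-degree coordinates from the airport information section
MIU = (11.85528, 13.08083)   # Maiduguri International Airport
LJG = (26.67917, 100.24556)  # Lijiang Sanyi International Airport

print(round(haversine_km(*MIU, *LJG), 1))  # ≈ 9142 km
```

The result lands within a kilometer of the haversine figure quoted above; the small gap versus the Vincenty result reflects the spherical-earth assumption.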

How long does it take to fly from Maiduguri to Lijiang?

The estimated flight time from Maiduguri International Airport to Lijiang Sanyi International Airport is 11 hours and 16 minutes.

Flight carbon footprint between Maiduguri International Airport (MIU) and Lijiang Sanyi International Airport (LJG)

On average, flying from Maiduguri to Lijiang generates about 675 kg of CO2 per passenger; 675 kilograms equals 1,487 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.

Map of flight path from Maiduguri to Lijiang

See the map of the shortest flight path between Maiduguri International Airport (MIU) and Lijiang Sanyi International Airport (LJG).

Airport information

Origin Maiduguri International Airport
City: Maiduguri
Country: Nigeria
IATA Code: MIU
ICAO Code: DNMA
Coordinates: 11°51′19″N, 13°4′51″E
Destination Lijiang Sanyi International Airport
City: Lijiang
Country: China
IATA Code: LJG
ICAO Code: ZPLJ
Coordinates: 26°40′45″N, 100°14′44″E
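The coordinates above are given in degrees, minutes, and seconds, while both distance formulas expect decimal degrees. A minimal conversion sketch (the function name is illustrative):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter (N/S/E/W)
    to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# Maiduguri: 11°51′19″N, 13°4′51″E
print(dms_to_decimal(11, 51, 19, "N"))    # ≈ 11.85528
# Lijiang: 26°40′45″N, 100°14′44″E
print(dms_to_decimal(100, 14, 44, "E"))   # ≈ 100.24556
```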