
How far is Blagoveschensk from Lhasa?

The distance between Lhasa (Lhasa Gonggar Airport) and Blagoveschensk (Ignatyevo Airport) is 2388 miles / 3844 kilometers / 2075 nautical miles.

The driving distance from Lhasa (LXA) to Blagoveschensk (BQS) is 3224 miles / 5188 kilometers, and travel time by car is about 60 hours 29 minutes.

Lhasa Gonggar Airport – Ignatyevo Airport

2388 miles / 3844 kilometers / 2075 nautical miles


Distance from Lhasa to Blagoveschensk

There are several ways to calculate the distance from Lhasa to Blagoveschensk. Here are two standard methods:

Vincenty's formula (applied above)
  • 2388.378 miles
  • 3843.722 kilometers
  • 2075.444 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
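For readers who want to reproduce the ellipsoidal figure, here is a minimal Python sketch using pyproj's Geod class with the airport coordinates listed further down the page. Note that pyproj solves the geodesic with Karney's algorithm rather than Vincenty's iteration, but both assume a WGS-84 ellipsoid and give essentially the same result.

```python
from pyproj import Geod

# Ellipsoidal (WGS-84) distance between the two airports.
geod = Geod(ellps="WGS84")

lxa_lat, lxa_lon = 29.29778, 90.91167    # Lhasa Gonggar Airport (LXA)
bqs_lat, bqs_lon = 50.42528, 127.41194   # Ignatyevo Airport (BQS)

# geod.inv takes lon/lat pairs and returns forward azimuth,
# back azimuth and distance in metres.
_, _, dist_m = geod.inv(lxa_lon, lxa_lat, bqs_lon, bqs_lat)
km = dist_m / 1000
print(f"{km:.1f} km / {km * 0.621371:.1f} mi")  # close to the 3843.7 km figure above
```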

Haversine formula
  • 2385.975 miles
  • 3839.854 kilometers
  • 2073.355 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
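The haversine computation is simple enough to do directly. A minimal sketch, assuming a mean earth radius of 6371 km and the airport coordinates listed further down the page (the exact result shifts slightly with the radius chosen):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    # Great-circle distance on a spherical earth.
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

lxa = (29.29778, 90.91167)    # Lhasa Gonggar Airport (LXA)
bqs = (50.42528, 127.41194)   # Ignatyevo Airport (BQS)

km = haversine_km(*lxa, *bqs)
print(f"{km:.1f} km / {km * 0.621371:.1f} mi / {km / 1.852:.1f} NM")
# roughly 3840 km, in line with the 3839.9 km figure above
```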

How long does it take to fly from Lhasa to Blagoveschensk?

The estimated flight time from Lhasa Gonggar Airport to Ignatyevo Airport is 5 hours and 1 minute.
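
The page does not say how the flight time is derived. A common rule of thumb is a fixed taxi-and-climb allowance plus cruise time at an average speed; the sketch below uses illustrative values (500 mph cruise, 30 minutes overhead) that are assumptions, so its output will not exactly match the 5 h 01 min figure above.

```python
def estimated_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    # Rough block-time estimate: fixed allowance plus cruise time
    # at an assumed average speed (illustrative parameters only).
    total_min = overhead_min + distance_miles / cruise_mph * 60
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} h {minutes:02d} min"

print(estimated_flight_time(2388))  # about 5 h 17 min with these assumed values
```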

Flight carbon footprint between Lhasa Gonggar Airport (LXA) and Ignatyevo Airport (BQS)

On average, flying from Lhasa to Blagoveschensk generates about 262 kg of CO2 per passenger (262 kilograms equals 578 pounds). The figures are estimates and include only the CO2 generated by burning jet fuel.
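
The pound figure is just a unit conversion of the same estimate; the sketch below also shows the per-mile factor implied by the page's own numbers (a derived ratio, not a stated methodology):

```python
KG_PER_LB = 0.45359237  # definition of the avoirdupois pound

co2_kg = 262
distance_mi = 2388

print(f"{co2_kg} kg = {co2_kg / KG_PER_LB:.0f} lb")            # -> 578 lb
print(f"{co2_kg / distance_mi:.3f} kg CO2 per passenger-mile")  # -> ~0.110
```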

Map of flight path and driving directions from Lhasa to Blagoveschensk

See the map of the shortest flight path between Lhasa Gonggar Airport (LXA) and Ignatyevo Airport (BQS).

Airport information

Origin: Lhasa Gonggar Airport
City: Lhasa
Country: China
IATA Code: LXA
ICAO Code: ZULS
Coordinates: 29°17′52″N, 90°54′42″E
Destination: Ignatyevo Airport
City: Blagoveschensk
Country: Russia
IATA Code: BQS
ICAO Code: UHBB
Coordinates: 50°25′31″N, 127°24′43″E
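
The coordinates above are given in degrees, minutes and seconds; to feed them into the distance formulas earlier on the page they need to be converted to decimal degrees. A minimal sketch, assuming the exact formatting shown here (° ′ ″ followed by a hemisphere letter):

```python
import re

def dms_to_decimal(dms: str) -> float:
    # Parse strings like 29°17′52″N or 127°24′43″E into decimal degrees.
    deg, minutes, seconds, hemi = re.match(r"(\d+)°(\d+)′(\d+)″([NSEW])", dms).groups()
    value = int(deg) + int(minutes) / 60 + int(seconds) / 3600
    return -value if hemi in "SW" else value

print(dms_to_decimal("29°17′52″N"))   # ~29.2978 (Lhasa Gonggar Airport latitude)
print(dms_to_decimal("127°24′43″E"))  # ~127.4119 (Ignatyevo Airport longitude)
```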