How far is Golog from Yushu?

The distance between Yushu (Yushu Batang Airport) and Golog (Golog Maqin Airport) is 217 miles / 350 kilometers / 189 nautical miles.

The driving distance from Yushu (YUS) to Golog (GMQ) is 341 miles / 549 kilometers, and travel time by car is about 6 hours 31 minutes.

Distance from Yushu to Golog

There are several ways to calculate the distance from Yushu to Golog. Here are two standard methods:

Vincenty's formula (applied above)
  • 217.497 miles
  • 350.027 kilometers
  • 188.999 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
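
For readers who want to reproduce the figure, here is a minimal Python sketch of the inverse Vincenty iteration on the WGS-84 ellipsoid. The function name, the decimal-degree coordinates, and the convergence settings are our own choices for illustration, not necessarily what this calculator uses:

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Inverse Vincenty solution on the WGS-84 ellipsoid; returns miles."""
    a = 6378137.0              # semi-major axis in metres
    f = 1 / 298.257223563      # flattening
    b = (1 - f) * a            # semi-minor axis

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0.0:
            return 0.0         # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1.0 - sin_alpha ** 2
        # cos(2*sigma_m); zero when the path runs along the equator
        cos_2sm = (cos_sigma - 2.0 * sinU1 * sinU2 / cos2_alpha
                   if cos2_alpha != 0.0 else 0.0)
        C = f / 16.0 * cos2_alpha * (4.0 + f * (4.0 - 3.0 * cos2_alpha))
        lam_prev = lam
        lam = L + (1.0 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1.0 + 2.0 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1.0 + u2 / 16384.0 * (4096.0 + u2 * (-768.0 + u2 * (320.0 - 175.0 * u2)))
    B = u2 / 1024.0 * (256.0 + u2 * (-128.0 + u2 * (74.0 - 47.0 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sm + B / 4.0 * (
        cos_sigma * (-1.0 + 2.0 * cos_2sm ** 2)
        - B / 6.0 * cos_2sm * (-3.0 + 4.0 * sin_sigma ** 2)
          * (-3.0 + 4.0 * cos_2sm ** 2)))
    metres = b * A * (sigma - delta_sigma)
    return metres / 1609.344   # international mile

# YUS (32°50′11″N, 97°2′11″E) to GMQ (34°25′5″N, 100°18′4″E)
print(f"{vincenty_miles(32.836389, 97.036389, 34.418056, 100.301111):.3f} miles")
# about 217.5 miles
```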

Haversine formula
  • 217.286 miles
  • 349.687 kilometers
  • 188.816 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
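
A matching sketch of the haversine computation, assuming the conventional 6,371 km mean Earth radius (a different radius choice would shift the result slightly):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given mean radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2.0 * radius_km * math.asin(math.sqrt(a))

km = haversine_km(32.836389, 97.036389, 34.418056, 100.301111)
print(f"{km:.3f} km / {km / 1.609344:.3f} miles")  # about 349.7 km / 217.3 miles
```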

How long does it take to fly from Yushu to Golog?

The estimated flight time from Yushu Batang Airport to Golog Maqin Airport is 54 minutes.
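
The calculator's exact timing model is not published. A common rule of thumb adds a fixed allowance for taxi, climb, and descent to the time spent at an assumed average cruise speed; the 500 mph and 30 minute figures below are illustrative assumptions that land in the same ballpark:

```python
def estimate_flight_minutes(distance_miles, cruise_mph=500.0, overhead_min=30.0):
    # Rule-of-thumb block time: fixed taxi/climb/descent overhead
    # plus time at an assumed average cruise speed.
    return overhead_min + distance_miles / cruise_mph * 60.0

print(round(estimate_flight_minutes(217.497)))  # 56 -- close to the 54 minutes quoted
```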

What is the time difference between Yushu and Golog?

There is no time difference between Yushu and Golog; both observe China Standard Time (UTC+8).

Flight carbon footprint between Yushu Batang Airport (YUS) and Golog Maqin Airport (GMQ)

On average, flying from Yushu to Golog generates about 57 kg (126 lb) of CO2 per passenger. This figure is an estimate and includes only the CO2 generated by burning jet fuel.
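
To verify the unit conversion (one pound is defined as exactly 0.45359237 kg):

```python
KG_PER_LB = 0.45359237  # exact definition of the international pound
co2_kg = 57
print(f"{co2_kg / KG_PER_LB:.1f} lb")  # 125.7 lb, i.e. about 126 lb
```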

Map of flight path and driving directions from Yushu to Golog

See the map of the shortest flight path between Yushu Batang Airport (YUS) and Golog Maqin Airport (GMQ).

Airport information

Origin: Yushu Batang Airport
City: Yushu
Country: China
IATA Code: YUS
ICAO Code: ZLYS
Coordinates: 32°50′11″N, 97°2′11″E
Destination: Golog Maqin Airport
City: Golog
Country: China
IATA Code: GMQ
ICAO Code: ZLGL
Coordinates: 34°25′5″N, 100°18′4″E
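
The coordinates above are given in degrees, minutes, and seconds, while the distance sketches earlier use decimal degrees. A small helper (our own, with north and east taken as positive) makes the conversion explicit:

```python
def dms_to_decimal(deg, minutes, seconds, positive=True):
    # Convert degrees/minutes/seconds to decimal degrees;
    # positive=False would flag a southern latitude or western longitude.
    value = deg + minutes / 60 + seconds / 3600
    return value if positive else -value

# YUS: 32°50′11″N, 97°2′11″E
print(f"{dms_to_decimal(32, 50, 11):.6f}, {dms_to_decimal(97, 2, 11):.6f}")
# GMQ: 34°25′5″N, 100°18′4″E
print(f"{dms_to_decimal(34, 25, 5):.6f}, {dms_to_decimal(100, 18, 4):.6f}")
```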