
How far is Bijie from Yulin?

The distance between Yulin (Yulin Yuyang Airport) and Bijie (Bijie Feixiong Airport) is 797 miles / 1283 kilometers / 693 nautical miles.

The driving distance from Yulin (UYN) to Bijie (BFJ) is 1020 miles / 1642 kilometers, and travel time by car is about 18 hours 40 minutes.

Yulin Yuyang Airport – Bijie Feixiong Airport

797 miles / 1283 kilometers / 693 nautical miles


Distance from Yulin to Bijie

There are several ways to calculate the distance from Yulin to Bijie. Here are two standard methods:

Vincenty's formula (applied above)
  • 797.415 miles
  • 1283.315 kilometers
  • 692.935 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
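For readers who want to reproduce the figure above, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid, using the airport coordinates listed at the bottom of this page. The exact result depends on the ellipsoid constants and convergence tolerance chosen, so it may differ slightly from the number quoted above.

```python
import math

def vincenty_inverse(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Ellipsoidal distance in metres between two points (Vincenty, WGS-84)."""
    a = 6378137.0               # semi-major axis (m)
    f = 1 / 298.257223563       # flattening
    b = (1 - f) * a             # semi-minor axis (m)

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    lam = L
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.sqrt((cosU2 * sin_lam) ** 2 +
                              (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        if sin_sigma == 0:
            return 0.0          # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos_sq_alpha = 1 - sin_alpha ** 2
        cos2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos_sq_alpha
                  if cos_sq_alpha != 0 else 0.0)
        C = f / 16 * cos_sq_alpha * (4 + f * (4 - 3 * cos_sq_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos2sm + C * cos_sigma * (-1 + 2 * cos2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u_sq = cos_sq_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
    B = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
    delta_sigma = B * sin_sigma * (cos2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos2sm ** 2) -
        B / 6 * cos2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos2sm ** 2)))
    return b * A * (sigma - delta_sigma)

# UYN (38°16′9″N, 109°43′51″E) to BFJ (27°16′1″N, 105°28′19″E)
d = vincenty_inverse(38.2692, 109.7308, 27.2669, 105.4719)
print(f"{d / 1609.344:.1f} mi / {d / 1000:.1f} km / {d / 1852:.1f} nmi")
# should land near the ~797 mi / ~1283 km figure above
```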

Haversine formula
  • 799.162 miles
  • 1286.126 kilometers
  • 694.453 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
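A corresponding sketch for the haversine formula, assuming a mean Earth radius of 6,371 km; the small spread between radius conventions accounts for sub-kilometre differences from the figure above.

```python
import math

def haversine(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in kilometres on a sphere of the given radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2 +
         math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# UYN (38°16′9″N, 109°43′51″E) to BFJ (27°16′1″N, 105°28′19″E)
print(haversine(38.2692, 109.7308, 27.2669, 105.4719))  # ≈ 1286 km
```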

How long does it take to fly from Yulin to Bijie?

The estimated flight time from Yulin Yuyang Airport to Bijie Feixiong Airport is 2 hours and 0 minutes.
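The calculator does not publish its exact timing model, but a common rule of thumb (an assumed average cruise speed of about 500 mph plus roughly 30 minutes for taxi, climb and descent) lands close to the two-hour figure:

```python
def estimate_flight_time(distance_miles, cruise_mph=500.0, overhead_min=30.0):
    # Rule-of-thumb block time: distance at cruise speed plus a fixed
    # allowance for taxi, climb and descent. Assumed values, not the
    # calculator's published method.
    total_min = distance_miles / cruise_mph * 60 + overhead_min
    return divmod(round(total_min), 60)

print(estimate_flight_time(797.415))  # (2, 6) -> close to the quoted 2 h 0 min
```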

What is the time difference between Yulin and Bijie?

There is no time difference between Yulin and Bijie; both cities observe China Standard Time (UTC+8).

Flight carbon footprint between Yulin Yuyang Airport (UYN) and Bijie Feixiong Airport (BFJ)

On average, flying from Yulin to Bijie generates about 135 kg of CO2 per passenger, which is roughly 297 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
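The unit conversion behind the pound figure, and the per-mile emission rate implied by the 135 kg estimate (the emission estimate itself comes from this page, not a universal constant), can be checked as follows:

```python
KG_PER_LB = 0.45359237          # definition of the avoirdupois pound

co2_kg = 135.0                  # per-passenger estimate quoted above
distance_miles = 797.415        # Vincenty distance quoted above

co2_lb = co2_kg / KG_PER_LB
implied_rate = co2_kg / distance_miles   # kg CO2 per passenger-mile

print(f"{co2_lb:.0f} lb")        # ≈ 298 lb (the page rounds to 297)
print(f"{implied_rate:.3f} kg")  # ≈ 0.169 kg CO2 per passenger-mile
```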

Map of flight path and driving directions from Yulin to Bijie

See the map of the shortest flight path between Yulin Yuyang Airport (UYN) and Bijie Feixiong Airport (BFJ).

Airport information

Origin: Yulin Yuyang Airport
City: Yulin
Country: China
IATA Code: UYN
ICAO Code: ZLYL
Coordinates: 38°16′9″N, 109°43′51″E
Destination: Bijie Feixiong Airport
City: Bijie
Country: China
IATA Code: BFJ
ICAO Code: ZUBJ
Coordinates: 27°16′1″N, 105°28′19″E