
How far is Harbin from Bijie?

The distance between Bijie (Bijie Feixiong Airport) and Harbin (Harbin Taiping International Airport) is 1705 miles / 2744 kilometers / 1481 nautical miles.

The driving distance from Bijie (BFJ) to Harbin (HRB) is 2046 miles / 3293 kilometers, and travel time by car is about 37 hours 30 minutes.

Bijie Feixiong Airport – Harbin Taiping International Airport

1705 miles / 2744 kilometers / 1481 nautical miles


Distance from Bijie to Harbin

There are several ways to calculate the distance from Bijie to Harbin. Here are two standard methods:

Vincenty's formula (applied above)
  • 1704.769 miles
  • 2743.560 kilometers
  • 1481.404 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
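As a rough illustration, the same ellipsoidal (WGS-84) distance can be computed with the pyproj library. Note that pyproj's Geod uses Karney's geodesic algorithm rather than Vincenty's original iteration, so this is a sketch of the same ellipsoidal calculation rather than the exact method used above; the coordinates are taken from the airport information section further down the page.

```python
from pyproj import Geod  # pip install pyproj

# Airport coordinates in decimal degrees (from the airport information below)
BFJ = (27.2669, 105.4719)   # Bijie Feixiong Airport
HRB = (45.6233, 126.2500)   # Harbin Taiping International Airport

geod = Geod(ellps="WGS84")  # ellipsoidal earth model

# inv() takes longitude/latitude order and returns
# (forward azimuth, back azimuth, distance in metres)
_, _, meters = geod.inv(BFJ[1], BFJ[0], HRB[1], HRB[0])

print(f"{meters / 1609.344:.1f} miles")       # ≈ 1704.8 miles
print(f"{meters / 1000:.1f} kilometers")      # ≈ 2743.6 km
print(f"{meters / 1852:.1f} nautical miles")  # ≈ 1481.4 NM
```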

Haversine formula
  • 1704.929 miles
  • 2743.817 kilometers
  • 1481.543 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
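For comparison, the haversine formula is simple enough to compute directly. The sketch below assumes a mean earth radius of 3958.8 miles (about 6371 km) and uses the airport coordinates listed further down the page.

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2, radius_miles=3958.8):
    """Great-circle distance on a spherical earth, in miles."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_miles * math.asin(math.sqrt(a))

# Bijie Feixiong (BFJ) to Harbin Taiping (HRB)
print(round(haversine_miles(27.2669, 105.4719, 45.6233, 126.2500), 1))  # ≈ 1704.9
```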

How long does it take to fly from Bijie to Harbin?

The estimated flight time from Bijie Feixiong Airport to Harbin Taiping International Airport is 3 hours and 43 minutes.
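The flight time shown here is an estimate. A common back-of-the-envelope approach is to divide the great-circle distance by a typical cruise speed and add a fixed allowance for taxi, climb and descent; the 500 mph cruise speed and 18-minute allowance in the sketch below are illustrative assumptions that happen to reproduce the quoted figure, not the calculator's actual formula.

```python
def estimated_flight_time(distance_miles, cruise_mph=500, overhead_hours=0.3):
    # Assumed constants: 500 mph average cruise speed plus an
    # 18-minute (0.3 h) allowance for taxi, climb and descent.
    hours = distance_miles / cruise_mph + overhead_hours
    h, m = divmod(round(hours * 60), 60)
    return f"{h} hours {m} minutes"

print(estimated_flight_time(1705))  # ≈ 3 hours 43 minutes
```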

What is the time difference between Bijie and Harbin?

There is no time difference between Bijie and Harbin; both cities observe China Standard Time (UTC+8).

Flight carbon footprint between Bijie Feixiong Airport (BFJ) and Harbin Taiping International Airport (HRB)

On average, flying from Bijie to Harbin generates about 193 kg of CO2 per passenger, which is equivalent to about 425 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
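The kilogram-to-pound conversion, and the per-mile footprint it implies, can be checked directly. The per-mile emission factor below is simply the quoted total divided by the flight distance, derived only from the figures above rather than from any official methodology.

```python
KG_PER_LB = 0.45359237   # exact definition of the pound

co2_kg = 193
print(round(co2_kg / KG_PER_LB), "lbs")   # ≈ 425 lbs

# Implied per-passenger emission factor for this route
# (an assumption derived only from the quoted figures)
distance_miles = 1705
print(round(co2_kg / distance_miles, 3), "kg CO2 per mile")  # ≈ 0.113
```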

Map of flight path and driving directions from Bijie to Harbin

See the map of the shortest flight path between Bijie Feixiong Airport (BFJ) and Harbin Taiping International Airport (HRB).

Airport information

Origin Bijie Feixiong Airport
City: Bijie
Country: China
IATA Code: BFJ
ICAO Code: ZUBJ
Coordinates: 27°16′1″N, 105°28′19″E
Destination Harbin Taiping International Airport
City: Harbin
Country: China
IATA Code: HRB
ICAO Code: ZYHB
Coordinates: 45°37′24″N, 126°15′0″E