
How far is Batagay from Tyumen?

The distance between Tyumen (Roshchino International Airport) and Batagay (Batagay Airport) is 2205 miles / 3548 kilometers / 1916 nautical miles.

The driving distance from Tyumen (TJM) to Batagay (BQJ) is 4643 miles / 7472 kilometers, and travel time by car is about 122 hours 31 minutes.

Roshchino International Airport – Batagay Airport

2205 miles / 3548 kilometers / 1916 nautical miles


Distance from Tyumen to Batagay

There are several ways to calculate the distance from Tyumen to Batagay. Here are two standard methods:

Vincenty's formula (applied above)
  • 2204.828 miles
  • 3548.327 kilometers
  • 1915.944 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
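For illustration, here is a minimal Python sketch of the Vincenty inverse formula on the WGS-84 ellipsoid, using the airport coordinates listed in the airport information section below. The constants, tolerance, and iteration limit are standard textbook choices and an assumption, not this site's exact implementation.

import math

def vincenty_miles(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Inverse Vincenty formula on the WGS-84 ellipsoid; returns statute miles."""
    a = 6378137.0                    # semi-major axis (metres)
    f = 1 / 298.257223563           # flattening
    b = (1 - f) * a                  # semi-minor axis (metres)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0               # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha != 0 else 0.0)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m
            * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sigma_m ** 2)))
    meters = b * A * (sigma - delta_sigma)
    return meters / 1609.344         # metres -> statute miles

# TJM (57.18944 N, 65.32417 E) to BQJ (67.64778 N, 134.69500 E);
# should closely reproduce the ~2205-mile figure quoted above.
print(round(vincenty_miles(57.18944, 65.32417, 67.64778, 134.69500), 3))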

Haversine formula
  • 2196.955 miles
  • 3535.657 kilometers
  • 1909.102 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
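A corresponding haversine sketch, assuming the commonly used mean Earth radius of 6371 km (the exact radius this site assumes is not stated):

import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given radius, in kilometres."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h))

# TJM -> BQJ; a 6371 km radius gives a figure close to the 3535.657 km above.
print(round(haversine_km(57.18944, 65.32417, 67.64778, 134.69500), 3))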

How long does it take to fly from Tyumen to Batagay?

The estimated flight time from Roshchino International Airport to Batagay Airport is 4 hours and 40 minutes.
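Estimates like this are typically derived from the straight-line distance and an assumed average speed. The sketch below uses a hypothetical 500 mph cruise speed plus a flat 15-minute allowance for taxi, climb, and descent; these parameters are assumptions that happen to reproduce the figure above, not the site's published formula.

def estimated_flight_time(distance_miles, cruise_mph=500.0, overhead_min=15.0):
    """Rough block-time estimate: distance at a fixed cruise speed plus a flat overhead."""
    total_min = distance_miles / cruise_mph * 60 + overhead_min
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} hours {minutes} minutes"

print(estimated_flight_time(2205))   # roughly "4 hours 40 minutes" with these assumptions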

Flight carbon footprint between Roshchino International Airport (TJM) and Batagay Airport (BQJ)

On average, flying from Tyumen to Batagay generates about 241 kg (531 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
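As a rough check on these numbers: the pound figure is a direct unit conversion, and dividing the per-passenger total by the flight distance works out to roughly 68 g of CO2 per passenger-kilometre. A small sketch, with the emission total taken from the estimate above:

KG_PER_LB = 0.45359237          # exact definition of the pound

co2_kg = 241                    # estimated CO2 per passenger for TJM-BQJ (from above)
distance_km = 3548.327          # Vincenty distance from above

print(round(co2_kg / KG_PER_LB))            # -> 531 lbs
print(round(co2_kg / distance_km * 1000))   # -> ~68 g CO2 per passenger-km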

Map of flight path and driving directions from Tyumen to Batagay

See the map of the shortest flight path between Roshchino International Airport (TJM) and Batagay Airport (BQJ).

Airport information

Origin Roshchino International Airport
City: Tyumen
Country: Russia
IATA Code: TJM
ICAO Code: USTR
Coordinates: 57°11′22″N, 65°19′27″E
Destination Batagay Airport
City: Batagay
Country: Russia
IATA Code: BQJ
ICAO Code: UEBB
Coordinates: 67°38′52″N, 134°41′42″E
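
The coordinates above are given in degrees, minutes, and seconds; converting them to the decimal degrees used by the distance formulas is a simple weighted sum, for example:

def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# Roshchino International Airport (TJM): 57°11′22″N, 65°19′27″E
print(dms_to_decimal(57, 11, 22, "N"), dms_to_decimal(65, 19, 27, "E"))   # ~57.1894, ~65.3242
# Batagay Airport (BQJ): 67°38′52″N, 134°41′42″E
print(dms_to_decimal(67, 38, 52, "N"), dms_to_decimal(134, 41, 42, "E"))  # ~67.6478, ~134.6950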