
How far is Bisha from Los Angeles, CA?

The distance between Los Angeles (Los Angeles International Airport) and Bisha (Bisha Domestic Airport) is 8518 miles / 13709 kilometers / 7402 nautical miles.

Los Angeles International Airport – Bisha Domestic Airport

Distance: 8518 miles / 13709 kilometers / 7402 nautical miles
Flight time: 16 h 37 min
CO2 emission: 1 075 kg

Distance from Los Angeles to Bisha

There are several ways to calculate the distance from Los Angeles to Bisha. Here are two standard methods:

Vincenty's formula (applied above)
  • 8518.326 miles
  • 13708.917 kilometers
  • 7402.223 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
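
For reference, below is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid. The coordinates are the LAX and BHH values from the "Airport information" section converted to decimal degrees; the convergence tolerance and iteration cap are illustrative choices, and this is not the calculator's own implementation.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Ellipsoidal distance in metres between two lat/lon points (Vincenty inverse)."""
    # WGS-84 ellipsoid parameters
    a = 6378137.0                # semi-major axis (m)
    f = 1 / 298.257223563        # flattening
    b = (1 - f) * a              # semi-minor axis (m)

    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(phi1))   # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # nearly antipodal points may not converge; not handled here
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.sqrt((cosU2 * sin_lam) ** 2 +
                              (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        if sin_sigma == 0:
            return 0.0           # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos_sq_alpha = 1 - sin_alpha ** 2
        cos2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos_sq_alpha
                       if cos_sq_alpha != 0 else 0.0)   # equatorial line
        C = f / 16 * cos_sq_alpha * (4 + f * (4 - 3 * cos_sq_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos2sigma_m + C * cos_sigma * (-1 + 2 * cos2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u_sq = cos_sq_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
    B = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
    delta_sigma = B * sin_sigma * (cos2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos2sigma_m ** 2) -
        B / 6 * cos2sigma_m * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)   # distance in metres

# LAX -> BHH, coordinates in decimal degrees
meters = vincenty_distance(33.9425, -118.4078, 19.9842, 42.6208)
print(meters / 1609.344, meters / 1000)   # should land close to the 8518.3 mi / 13708.9 km above
```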

Haversine formula
  • 8507.133 miles
  • 13690.904 kilometers
  • 7392.497 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
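
For comparison, here is a short haversine sketch. The 6 371 km mean Earth radius is an assumed value (the page does not state which radius it uses), so the result will vary slightly with that choice.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given radius (assumed mean Earth radius)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

# LAX -> BHH, using the decimal coordinates derived from "Airport information" below
print(haversine_km(33.9425, -118.4078, 19.9842, 42.6208))  # roughly the ~13 691 km figure above
```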

How long does it take to fly from Los Angeles to Bisha?

The estimated flight time from Los Angeles International Airport to Bisha Domestic Airport is 16 hours and 37 minutes.
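
The calculator does not publish its speed model, but the figure can be reproduced with a simple distance-over-average-speed calculation. The ~513 mph average below is simply the value implied by 8518 miles flown in 16 h 37 min; it is an assumption, not a documented parameter.

```python
def flight_time(distance_miles, avg_speed_mph):
    """Convert a distance and an assumed average block speed into hours and minutes."""
    hours = distance_miles / avg_speed_mph
    h = int(hours)
    m = round((hours - h) * 60)
    return h, m

print(flight_time(8518, 513))   # -> (16, 36), within a minute of the figure above
```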

Flight carbon footprint between Los Angeles International Airport (LAX) and Bisha Domestic Airport (BHH)

On average, flying from Los Angeles to Bisha generates about 1 075 kg of CO2 per passenger, which is equivalent to 2 370 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
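
The pound figure is a straightforward unit conversion (1 kg ≈ 2.20462 lb), as this snippet illustrates:

```python
KG_TO_LB = 2.20462                 # kilograms to pounds

co2_kg = 1075                      # estimated CO2 per passenger, from the figure above
print(round(co2_kg * KG_TO_LB))    # -> 2370 lb
```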

Map of flight path from Los Angeles to Bisha

See the map of the shortest flight path between Los Angeles International Airport (LAX) and Bisha Domestic Airport (BHH).

Airport information

Origin: Los Angeles International Airport
City: Los Angeles, CA
Country: United States
IATA Code: LAX
ICAO Code: KLAX
Coordinates: 33°56′33″N, 118°24′28″W
Destination: Bisha Domestic Airport
City: Bisha
Country: Saudi Arabia
IATA Code: BHH
ICAO Code: OEBH
Coordinates: 19°59′3″N, 42°37′15″E
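
To use these coordinates with the distance formulas above, they first need to be converted from degrees/minutes/seconds to decimal degrees. A small sketch (the helper name is illustrative):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# LAX: 33°56′33″N, 118°24′28″W  ->  (33.9425, -118.4078)
lax = (dms_to_decimal(33, 56, 33, "N"), dms_to_decimal(118, 24, 28, "W"))
# BHH: 19°59′3″N, 42°37′15″E    ->  (19.9842, 42.6208)
bhh = (dms_to_decimal(19, 59, 3, "N"), dms_to_decimal(42, 37, 15, "E"))
print(lax, bhh)
```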