How far is Campos from Houston, TX?

The distance between Houston (Houston George Bush Intercontinental Airport) and Campos (Bartolomeu Lysandro Airport) is 5054 miles / 8133 kilometers / 4392 nautical miles.

Houston George Bush Intercontinental Airport – Bartolomeu Lysandro Airport


Distance from Houston to Campos

There are several ways to calculate the distance from Houston to Campos. Here are two standard methods:

Vincenty's formula (applied above)
  • 5053.785 miles
  • 8133.278 kilometers
  • 4391.619 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
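A compact sketch of Vincenty's inverse method on the WGS-84 ellipsoid is shown below. The airport coordinates are taken from the airport information section of this page, converted to decimal degrees; the iteration limit and convergence tolerance are implementation choices, not part of the formula itself.

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns kilometers."""
    a = 6378137.0                  # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563          # WGS-84 flattening
    b = (1 - f) * a                # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):           # iterate until lambda converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
                                   * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000  # meters -> kilometers

# IAH (29°59′3″N, 95°20′29″W) to CAW (21°41′53″S, 41°18′6″W)
print(round(vincenty_km(29.98417, -95.34139, -21.69806, -41.30167), 1))
```

The result agrees with the 8133.278 km figure above to within a few kilometers (small differences come from rounding the coordinates to decimal degrees).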

Haversine formula
  • 5062.652 miles
  • 8147.549 kilometers
  • 4399.324 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
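The haversine calculation is much simpler, since it treats the earth as a sphere. A minimal sketch, using the same airport coordinates converted to decimal degrees and a mean earth radius of 3958.8 miles (an assumed value; other radius choices shift the result slightly):

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, assuming a spherical earth."""
    R = 3958.8  # assumed mean earth radius in miles
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

# IAH (29°59′3″N, 95°20′29″W) to CAW (21°41′53″S, 41°18′6″W)
print(round(haversine_miles(29.98417, -95.34139, -21.69806, -41.30167), 1))
```

This reproduces the 5062.652-mile figure above to within about a mile, and the small gap between it and the Vincenty result reflects the earth's slight flattening.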

How long does it take to fly from Houston to Campos?

The estimated flight time from Houston George Bush Intercontinental Airport to Bartolomeu Lysandro Airport is 10 hours and 4 minutes.
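The calculator's exact assumptions are not stated; a common rough estimate divides the distance by an average block speed of about 500 mph (an assumed figure covering cruise plus climb and descent), which lands within a few minutes of the time quoted above:

```python
# Rough flight-time estimate: distance / assumed average block speed.
distance_miles = 5054
avg_speed_mph = 500  # assumed average speed including climb and descent

hours = distance_miles / avg_speed_mph
h = int(hours)
m = round((hours - h) * 60)
print(f"{h} h {m} min")
```

This sketch gives 10 h 6 min versus the 10 h 4 min quoted, consistent with the site using a slightly different speed assumption.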

Flight carbon footprint between Houston George Bush Intercontinental Airport (IAH) and Bartolomeu Lysandro Airport (CAW)

On average, flying from Houston to Campos generates about 591 kg of CO2 per passenger, which is equivalent to about 1,303 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
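The pound figure is just a unit conversion of the 591 kg estimate, using the standard factor of about 2.20462 pounds per kilogram:

```python
KG_TO_LB = 2.20462  # pounds per kilogram

co2_kg = 591                       # estimated CO2 per passenger, from above
co2_lb = round(co2_kg * KG_TO_LB)  # 591 kg -> pounds
print(co2_lb)
```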

Map of flight path from Houston to Campos

See the map of the shortest flight path between Houston George Bush Intercontinental Airport (IAH) and Bartolomeu Lysandro Airport (CAW).

Airport information

Origin Houston George Bush Intercontinental Airport
City: Houston, TX
Country: United States
IATA Code: IAH
ICAO Code: KIAH
Coordinates: 29°59′3″N, 95°20′29″W
Destination Bartolomeu Lysandro Airport
City: Campos
Country: Brazil
IATA Code: CAW
ICAO Code: SBCP
Coordinates: 21°41′53″S, 41°18′6″W
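The coordinates above are given in degrees/minutes/seconds; the distance formulas need them as signed decimal degrees. A small helper for the conversion (the function name is illustrative):

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds to signed decimal degrees.

    South and West hemispheres are negative by convention.
    """
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

# IAH: 29°59′3″N, 95°20′29″W
print(dms_to_decimal(29, 59, 3, "N"), dms_to_decimal(95, 20, 29, "W"))
# CAW: 21°41′53″S, 41°18′6″W
print(dms_to_decimal(21, 41, 53, "S"), dms_to_decimal(41, 18, 6, "W"))
```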