How far is London from Burlington, IA?

The distance between Burlington (Southeast Iowa Regional Airport) and London (London International Airport) is 537 miles / 864 kilometers / 466 nautical miles.

The driving distance from Burlington (BRL) to London (YXU) is 632 miles / 1017 kilometers, and travel time by car is about 11 hours 48 minutes.

Southeast Iowa Regional Airport – London International Airport

537 miles / 864 kilometers / 466 nautical miles

Distance from Burlington to London

There are several ways to calculate the distance from Burlington to London. Here are two standard methods:

Vincenty's formula (applied above)
  • 536.680 miles
  • 863.703 kilometers
  • 466.362 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
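For reference, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The decimal coordinates are converted from the DMS values listed under Airport information below; the function name, tolerance, and iteration cap are illustrative choices, not this site's actual code.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns statute miles."""
    a = 6378137.0              # WGS-84 semi-major axis (meters)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a            # semi-minor axis

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0         # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # cos(2 * sigma_m); zero along equatorial lines (cos2_alpha == 0)
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2) -
        B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    meters = b * A * (sigma - delta_sigma)
    return meters / 1609.344   # meters -> statute miles

# BRL (40°46′59″N, 91°7′31″W) to YXU (43°2′8″N, 81°9′14″W)
print(round(vincenty_miles(40.78306, -91.12528, 43.03556, -81.15389), 3))  # ≈ 536.68
```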

Haversine formula
  • 535.447 miles
  • 861.719 kilometers
  • 465.291 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
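A corresponding haversine sketch, assuming the commonly used mean Earth radius of 6371 km (the exact radius this page's calculator uses is not stated):

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance on a sphere of radius 6371 km, in statute miles."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * 6371.0 * math.asin(math.sqrt(h)) / 1.609344  # km -> miles

print(round(haversine_miles(40.78306, -91.12528, 43.03556, -81.15389), 3))  # ≈ 535.4
```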

How long does it take to fly from Burlington to London?

The estimated flight time from Southeast Iowa Regional Airport to London International Airport is 1 hour and 30 minutes.
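The page does not publish its timing model. A common rule of thumb adds a fixed taxi/climb allowance to cruise time at a typical jetliner speed; a sketch under those assumptions (the 500 mph cruise speed and 30-minute overhead are illustrative, not this site's exact model):

```python
def estimated_flight_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    """Rule-of-thumb flight time: fixed taxi/climb overhead plus cruise time."""
    return overhead_min + distance_miles / cruise_mph * 60

minutes = estimated_flight_minutes(537)
print(f"{int(minutes // 60)} h {int(minutes % 60)} m")  # ≈ 1 h 34 m with these assumptions
```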

Flight carbon footprint between Southeast Iowa Regional Airport (BRL) and London International Airport (YXU)

On average, flying from Burlington to London generates about 104 kg of CO2 per passenger (equivalent to 229 pounds). These figures are estimates and include only the CO2 generated by burning jet fuel.
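The page does not state its emissions model either. A typical approach multiplies per-passenger fuel burn by the widely used factor of roughly 3.16 kg of CO2 per kg of jet fuel burned; the burn rate below is an assumption back-fitted to the quoted 104 kg, since real values vary by aircraft and load factor.

```python
CO2_PER_KG_FUEL = 3.16  # kg CO2 per kg jet fuel (widely used ICAO factor)

def co2_kg(distance_km, fuel_kg_per_pax_km=0.038):
    """Hypothetical per-passenger CO2 estimate: fuel burn x emission factor."""
    return distance_km * fuel_kg_per_pax_km * CO2_PER_KG_FUEL

print(round(co2_kg(864)))            # ≈ 104 kg
print(round(co2_kg(864) * 2.20462))  # ≈ 229 lb
```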

Map of flight path and driving directions from Burlington to London

See the map of the shortest flight path between Southeast Iowa Regional Airport (BRL) and London International Airport (YXU).

Airport information

Origin: Southeast Iowa Regional Airport
City: Burlington, IA
Country: United States
IATA Code: BRL
ICAO Code: KBRL
Coordinates: 40°46′59″N, 91°7′31″W
Destination: London International Airport
City: London
Country: Canada
IATA Code: YXU
ICAO Code: CYXU
Coordinates: 43°2′8″N, 81°9′14″W