How far is Burlington, IA, from Punta Cana?

The distance between Punta Cana (Punta Cana International Airport) and Burlington (Southeast Iowa Regional Airport) is 2040 miles / 3284 kilometers / 1773 nautical miles.

Distance from Punta Cana to Burlington

There are several ways to calculate the distance from Punta Cana to Burlington. Here are two standard methods:

Vincenty's formula (applied above)
  • 2040.387 miles
  • 3283.684 kilometers
  • 1773.048 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
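For readers who want to reproduce the figure, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. It is a standard textbook implementation rather than the exact code behind this page, and the decimal coordinates are converted from the airport information listed further down.

    import math

    def vincenty_distance(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
        """Vincenty's inverse formula on the WGS-84 ellipsoid; returns meters."""
        a = 6378137.0              # WGS-84 semi-major axis (meters)
        f = 1 / 298.257223563      # WGS-84 flattening
        b = (1 - f) * a            # semi-minor axis (meters)

        L = math.radians(lon2 - lon1)
        U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
        U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)

        lam = L                    # longitude difference on the auxiliary sphere
        for _ in range(max_iter):
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.hypot(cosU2 * sin_lam,
                                   cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
            if sin_sigma == 0:
                return 0.0         # coincident points
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos2_alpha = 1 - sin_alpha ** 2
            # cos(2 * sigma_m); cos2_alpha is zero only for equatorial lines
            cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
            C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
            lam_prev = lam
            lam = L + (1 - C) * f * sin_alpha * (
                sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
            if abs(lam - lam_prev) < tol:
                break

        u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sm ** 2)
            - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
        return b * A * (sigma - d_sigma)

    # PUJ (18°34′2″N, 68°21′48″W) to BRL (40°46′59″N, 91°7′31″W), in decimal degrees
    meters = vincenty_distance(18.5672, -68.3633, 40.7831, -91.1253)
    print(meters / 1609.344)       # ≈ 2040 statute miles, matching the figure above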

Haversine formula
  • 2042.193 miles
  • 3286.591 kilometers
  • 1774.617 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
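The haversine calculation is much shorter. Below is a minimal Python sketch using a mean earth radius of 6,371 km (an assumed value; this site's exact radius may differ slightly):

    import math

    def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
        """Great-circle distance on a sphere of the given radius; returns kilometers."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        h = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
        return 2 * radius_km * math.asin(math.sqrt(h))

    # Same decimal-degree coordinates as in the Vincenty sketch above
    print(haversine_distance(18.5672, -68.3633, 40.7831, -91.1253))  # ≈ 3287 km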

How long does it take to fly from Punta Cana to Burlington?

The estimated flight time from Punta Cana International Airport to Southeast Iowa Regional Airport is 4 hours and 21 minutes.
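The page does not state the speed assumptions behind this estimate. A common rule of thumb for estimates of this kind is an average cruise speed of about 500 mph plus roughly 30 minutes for taxi, takeoff, climb, and descent; the sketch below uses those assumed parameters, so its output lands near, but not exactly on, the 4 hours and 21 minutes quoted above.

    def estimated_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
        """Rule-of-thumb estimate: cruise leg plus a fixed taxi/climb/descent allowance."""
        total_min = distance_miles / cruise_mph * 60 + overhead_min
        hours, minutes = divmod(round(total_min), 60)
        return f"{hours} hours and {minutes} minutes"

    print(estimated_flight_time(2040))  # "4 hours and 35 minutes" under these assumptions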

Flight carbon footprint between Punta Cana International Airport (PUJ) and Southeast Iowa Regional Airport (BRL)

On average, flying from Punta Cana to Burlington generates about 222 kg of CO2 per passenger; 222 kilograms is equal to 490 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
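Dividing the two numbers above gives an implied emission factor of roughly 0.109 kg of CO2 per passenger-mile. The sketch below shows that back-calculation together with the kilogram-to-pound conversion; the factor is derived from this page's figures, not a published constant.

    KG_PER_LB = 0.45359237            # exact definition of the avoirdupois pound

    distance_miles = 2040
    co2_kg = 222                      # per-passenger estimate quoted above

    implied_factor = co2_kg / distance_miles   # kg of CO2 per passenger-mile
    co2_lbs = co2_kg / KG_PER_LB

    print(f"{implied_factor:.3f} kg/mile, {co2_lbs:.0f} lbs")  # 0.109 kg/mile, 489 lbs (rounded to 490 above)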

Map of flight path from Punta Cana to Burlington

See the map of the shortest flight path between Punta Cana International Airport (PUJ) and Southeast Iowa Regional Airport (BRL).

Airport information

Origin: Punta Cana International Airport
City: Punta Cana
Country: Dominican Republic
IATA Code: PUJ
ICAO Code: MDPC
Coordinates: 18°34′2″N, 68°21′48″W

Destination: Southeast Iowa Regional Airport
City: Burlington, IA
Country: United States
IATA Code: BRL
ICAO Code: KBRL
Coordinates: 40°46′59″N, 91°7′31″W
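The coordinates above use degrees, minutes, and seconds. Here is a small sketch of the conversion to the decimal degrees used in the distance code earlier on this page (south and west take a negative sign):

    def dms_to_decimal(degrees, minutes, seconds, direction):
        """Convert degrees/minutes/seconds plus a compass direction to decimal degrees."""
        sign = -1 if direction in ("S", "W") else 1
        return sign * (degrees + minutes / 60 + seconds / 3600)

    # PUJ: 18°34′2″N, 68°21′48″W
    print(dms_to_decimal(18, 34, 2, "N"), dms_to_decimal(68, 21, 48, "W"))
    # ≈ 18.5672, -68.3633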