
How far is Puerto Plata from Punta Cana?

The distance between Punta Cana (Punta Cana International Airport) and Puerto Plata (Gregorio Luperón International Airport) is 166 miles / 267 kilometers / 144 nautical miles.

The driving distance from Punta Cana (PUJ) to Puerto Plata (POP) is 242 miles / 390 kilometers, and travel time by car is about 5 hours 23 minutes.

Punta Cana International Airport – Gregorio Luperón International Airport

166 miles · 267 kilometers · 144 nautical miles


Distance from Punta Cana to Puerto Plata

There are several ways to calculate the distance from Punta Cana to Puerto Plata. Here are two standard methods:

Vincenty's formula (applied above)
  • 165.847 miles
  • 266.905 kilometers
  • 144.117 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
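The iterative inverse solution on the WGS-84 ellipsoid can be sketched in Python (a minimal textbook implementation of Vincenty's inverse formula, not necessarily the exact code this site runs; the decimal coordinates come from the airport information below):

```python
import math

def vincenty_distance_km(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns km."""
    a = 6378137.0              # semi-major axis (m)
    f = 1 / 298.257223563      # flattening
    b = (1 - f) * a            # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):       # iterate the auxiliary longitude to convergence
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0         # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0

# PUJ and POP in decimal degrees (west longitudes are negative)
km = vincenty_distance_km(18.5672, -68.3633, 19.7578, -70.5697)
print(round(km, 1), "km /", round(km / 1.609344, 1), "miles")
```

The iteration converges in a handful of passes for well-separated, non-antipodal points like these and reproduces the ~266.9 km figure quoted above.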

Haversine formula
  • 165.845 miles
  • 266.902 kilometers
  • 144.115 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
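Under that spherical assumption the computation is only a few lines (a sketch using the conventional mean earth radius of 6,371 km):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance on a sphere of mean radius 6371 km."""
    R = 6371.0
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

km = haversine_km(18.5672, -68.3633, 19.7578, -70.5697)
print(round(km, 1), "km")  # close to the 266.9 km quoted above
```

Over a short hop like PUJ–POP the spherical and ellipsoidal results differ by only a few meters, which is why the two methods above agree to three decimal places.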

How long does it take to fly from Punta Cana to Puerto Plata?

The estimated flight time from Punta Cana International Airport to Gregorio Luperón International Airport is 48 minutes.
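The site does not publish its formula, but a common rule of thumb (an assumption here, not the site's stated method) adds a fixed overhead for taxi, takeoff, climb and landing to cruise time at a typical jet speed, which lands close to the 48-minute figure for a 166-mile hop:

```python
def estimated_flight_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough flight-time estimate: fixed ground/climb overhead plus cruise.
    cruise_mph and overhead_min are illustrative assumptions."""
    return overhead_min + distance_miles / cruise_mph * 60

print(round(estimated_flight_minutes(166)), "minutes")  # ~50 minutes
```

With these assumed constants the estimate comes out near 50 minutes; the site's 48-minute figure presumably reflects slightly different constants.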

What is the time difference between Punta Cana and Puerto Plata?

There is no time difference between Punta Cana and Puerto Plata; both observe Atlantic Standard Time (UTC−4) year-round.

Flight carbon footprint between Punta Cana International Airport (PUJ) and Gregorio Luperón International Airport (POP)

On average, flying from Punta Cana to Puerto Plata generates about 49 kg (roughly 108 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.

Map of flight path and driving directions from Punta Cana to Puerto Plata

See the map of the shortest flight path between Punta Cana International Airport (PUJ) and Gregorio Luperón International Airport (POP).

Airport information

Origin Punta Cana International Airport
City: Punta Cana
Country: Dominican Republic
IATA Code: PUJ
ICAO Code: MDPC
Coordinates: 18°34′2″N, 68°21′48″W
Destination Gregorio Luperón International Airport
City: Puerto Plata
Country: Dominican Republic
IATA Code: POP
ICAO Code: MDPP
Coordinates: 19°45′28″N, 70°34′11″W
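The coordinates above are in degrees/minutes/seconds; the distance formulas want decimal degrees. The conversion is simple (a small sketch, with south latitudes and west longitudes made negative):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere letter to decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# PUJ: 18°34′2″N, 68°21′48″W
print(dms_to_decimal(18, 34, 2, "N"), dms_to_decimal(68, 21, 48, "W"))
```

For PUJ this yields approximately 18.5672, −68.3633, the values used in the distance examples above.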