
How far is Ponta Grossa from Charlotte, NC?

The distance between Charlotte (Charlotte Douglas International Airport) and Ponta Grossa (Ponta Grossa Airport) is 4618 miles / 7431 kilometers / 4013 nautical miles.

Charlotte Douglas International Airport – Ponta Grossa Airport

  • 4618 miles
  • 7431 kilometers
  • 4013 nautical miles


Distance from Charlotte to Ponta Grossa

There are several ways to calculate the distance from Charlotte to Ponta Grossa. Here are two standard methods:

Vincenty's formula (applied above)
  • 4617.529 miles
  • 7431.192 kilometers
  • 4012.523 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
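The figure above can be reproduced with a straightforward implementation of Vincenty's inverse formula on the WGS-84 ellipsoid. This is a minimal sketch (no handling of the near-antipodal cases where the iteration can fail to converge), using the airport coordinates from the table below converted to decimal degrees:

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns statute miles."""
    a = 6378137.0                  # semi-major axis (m)
    f = 1 / 298.257223563          # flattening
    b = (1 - f) * a                # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):           # iterate until the longitude estimate converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (2 * cos_2sigma_m ** 2 - 1)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sigma_m ** 2)))
    meters = b * A * (sigma - delta_sigma)
    return meters / 1609.344       # metres -> statute miles

# CLT (35°12′50″N, 80°56′35″W) and PGZ (25°11′4″S, 50°8′38″W) in decimal degrees
clt = (35.21389, -80.94306)
pgz = (-25.18444, -50.14389)
print(round(vincenty_miles(*clt, *pgz), 1))  # roughly 4617.5 miles
```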

Haversine formula
  • 4633.729 miles
  • 7457.264 kilometers
  • 4026.600 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
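The haversine computation is short enough to sketch in full. This version assumes a mean Earth radius of 6371 km, which is why its result differs slightly from the ellipsoidal Vincenty figure above:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere with the given mean Earth radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# CLT -> PGZ, coordinates in decimal degrees
km = haversine_km(35.21389, -80.94306, -25.18444, -50.14389)
print(round(km), round(km / 1.609344), round(km / 1.852))  # km, miles, nautical miles
```

The exact output depends on the radius chosen; 6371 km lands within a few kilometers of the 7457 km quoted above.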

How long does it take to fly from Charlotte to Ponta Grossa?

The estimated flight time from Charlotte Douglas International Airport to Ponta Grossa Airport is 9 hours and 14 minutes.
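The quoted time is consistent with dividing the distance by an assumed average ground speed of about 500 mph. A minimal sketch of that estimate (the 500 mph figure is an assumption; real block times vary with winds, routing, and aircraft type):

```python
def flight_time(miles, cruise_mph=500):
    """Rough flight-time estimate: distance / assumed average ground speed."""
    minutes = miles / cruise_mph * 60
    return int(minutes // 60), round(minutes % 60)

h, m = flight_time(4618)
print(f"{h} h {m} min")  # 9 h 14 min
```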

Flight carbon footprint between Charlotte Douglas International Airport (CLT) and Ponta Grossa Airport (PGZ)

On average, flying from Charlotte to Ponta Grossa generates about 534 kg (1,178 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
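The unit conversion uses the exact definition of the avoirdupois pound. Note that a rounded 534 kg converts to about 1,177 lb, so the published pound figure presumably rounds a slightly more precise underlying kilogram value:

```python
KG_PER_LB = 0.45359237  # exact definition of the avoirdupois pound

def kg_to_lb(kg):
    """Convert kilograms to pounds."""
    return kg / KG_PER_LB

print(round(kg_to_lb(534)))  # ~1177 lb
```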

Map of flight path from Charlotte to Ponta Grossa

See the map of the shortest flight path between Charlotte Douglas International Airport (CLT) and Ponta Grossa Airport (PGZ).

Airport information

Origin Charlotte Douglas International Airport
City: Charlotte, NC
Country: United States
IATA Code: CLT
ICAO Code: KCLT
Coordinates: 35°12′50″N, 80°56′35″W
Destination Ponta Grossa Airport
City: Ponta Grossa
Country: Brazil
IATA Code: PGZ
ICAO Code: SBPG
Coordinates: 25°11′4″S, 50°8′38″W
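The coordinates above are in degrees–minutes–seconds; the distance formulas earlier on the page need decimal degrees. A small parsing helper for this page's coordinate format (the function name and regex are illustrative, not a library API):

```python
import re

def dms_to_decimal(dms):
    """Parse a coordinate like 35°12′50″N into signed decimal degrees."""
    m = re.match(r"(\d+)°(\d+)′(\d+)″([NSEW])", dms)
    deg, minute, sec, hemi = (int(m.group(1)), int(m.group(2)),
                              int(m.group(3)), m.group(4))
    value = deg + minute / 60 + sec / 3600
    # South latitudes and west longitudes are negative
    return -value if hemi in "SW" else value

print(round(dms_to_decimal("35°12′50″N"), 5))  # CLT latitude, ~35.21389
print(round(dms_to_decimal("50°8′38″W"), 5))   # PGZ longitude, ~-50.14389
```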