
How far is Londrina from Palmas?

The distance between Palmas (Palmas Airport) and Londrina (Londrina Airport) is 915 miles / 1473 kilometers / 795 nautical miles.

The driving distance from Palmas (PMW) to Londrina (LDB) is 1110 miles / 1786 kilometers, and travel time by car is about 24 hours 5 minutes.

Palmas Airport – Londrina Airport

  • 915 miles
  • 1473 kilometers
  • 795 nautical miles

Distance from Palmas to Londrina

There are several ways to calculate the distance from Palmas to Londrina. Here are two standard methods:

Vincenty's formula (applied above)
  • 915.390 miles
  • 1473.177 kilometers
  • 795.452 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
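For readers who want to reproduce the figure above, here is a minimal sketch of Vincenty's inverse method in Python, assuming the WGS-84 ellipsoid (the page does not state which ellipsoid it uses). The function name `vincenty_miles` is an illustrative helper, not the calculator's actual code.

```python
import math

# WGS-84 ellipsoid parameters (an assumption; the page doesn't name its datum)
A_AXIS = 6378137.0          # semi-major axis in meters
F = 1 / 298.257223563       # flattening
B_AXIS = (1 - F) * A_AXIS   # semi-minor axis in meters

def vincenty_miles(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Inverse Vincenty: geodesic distance in miles on the WGS-84 ellipsoid."""
    U1 = math.atan((1 - F) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - F) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0.0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1.0 - sin_alpha ** 2
        cos_2sm = (cos_sigma - 2.0 * sinU1 * sinU2 / cos2_alpha
                   if cos2_alpha else 0.0)  # cos2_alpha = 0 on the equator
        C = F / 16.0 * cos2_alpha * (4.0 + F * (4.0 - 3.0 * cos2_alpha))
        lam_prev = lam
        lam = L + (1.0 - C) * F * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma *
                                     (-1.0 + 2.0 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    # Vincenty's series for the ellipsoidal correction to the arc length
    u2 = cos2_alpha * (A_AXIS ** 2 - B_AXIS ** 2) / B_AXIS ** 2
    big_a = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    big_b = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = big_b * sin_sigma * (cos_2sm + big_b / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2) -
        big_b / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) *
        (-3 + 4 * cos_2sm ** 2)))
    meters = B_AXIS * big_a * (sigma - delta_sigma)
    return meters / 1609.344  # international mile

# PMW (10°17′29″S, 48°21′25″W) and LDB (23°20′0″S, 51°7′48″W) in decimal degrees
print(vincenty_miles(-10.291389, -48.356944, -23.333333, -51.13))  # ≈ 915.39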

Haversine formula
  • 919.499 miles
  • 1479.789 kilometers
  • 799.022 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
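A minimal haversine sketch in Python, assuming a mean Earth radius of 6371 km (other radius conventions shift the result slightly); `haversine_km` is an illustrative name.

```python
import math

EARTH_RADIUS_KM = 6371.0  # mean Earth radius (an assumed convention)

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers on a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

print(haversine_km(-10.291389, -48.356944, -23.333333, -51.13))  # ≈ 1479.8
```

Note the small gap between the two results (about 4 miles): the haversine formula treats the Earth as a perfect sphere, while Vincenty's formula accounts for its slight flattening at the poles.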

How long does it take to fly from Palmas to Londrina?

The estimated flight time from Palmas Airport to Londrina Airport is 2 hours and 13 minutes.
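The page does not publish its flight-time model. A common rule of thumb adds a fixed allowance for taxi, climb, and descent to cruise time at a typical jet speed; the sketch below assumes 30 minutes of overhead and a 500 mph cruise, which lands near (but not exactly on) the 2 hours 13 minutes shown above.

```python
def estimated_flight_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    """Rule-of-thumb flight time: fixed taxi/climb overhead plus cruise time.
    Both parameters are assumptions, not the site's published model."""
    return overhead_min + distance_miles / cruise_mph * 60

minutes = estimated_flight_minutes(915.39)
print(f"{int(minutes // 60)} h {round(minutes % 60)} min")  # ≈ 2 h 20 min
```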

What is the time difference between Palmas and Londrina?

There is no time difference between Palmas and Londrina.

Flight carbon footprint between Palmas Airport (PMW) and Londrina Airport (LDB)

On average, flying from Palmas to Londrina generates about 145 kg of CO2 per passenger, which is equivalent to 319 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
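The pound figure follows directly from the kilogram estimate, and the implied emission factor works out to roughly 0.16 kg of CO2 per passenger-mile on this route; a quick check (the constant is the exact kilograms-per-pound definition):

```python
KG_PER_LB = 0.45359237  # exact definition of the avoirdupois pound

co2_kg = 145            # the page's per-passenger estimate for this route
print(f"{co2_kg / KG_PER_LB:.1f} lbs")       # 319.7 lbs, shown above as 319
print(f"{co2_kg / 915.39:.3f} kg CO2/mile")  # 0.158, the implied factor
```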

Map of flight path and driving directions from Palmas to Londrina

See the map of the shortest flight path between Palmas Airport (PMW) and Londrina Airport (LDB).

Airport information

Origin: Palmas Airport
City: Palmas
Country: Brazil
IATA Code: PMW
ICAO Code: SBPJ
Coordinates: 10°17′29″S, 48°21′25″W
Destination: Londrina Airport
City: Londrina
Country: Brazil
IATA Code: LDB
ICAO Code: SBLO
Coordinates: 23°20′0″S, 51°7′48″W
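The coordinates above are listed in degrees-minutes-seconds. A small sketch that converts them to the signed decimal degrees used by the distance formulas (`dms_to_decimal` is an illustrative helper):

```python
import re

def dms_to_decimal(dms: str) -> float:
    """Convert a coordinate like 23°20′0″S to signed decimal degrees."""
    deg, minutes, seconds, hemi = re.match(
        r"(\d+)°(\d+)′(\d+)″([NSEW])", dms).groups()
    value = int(deg) + int(minutes) / 60 + int(seconds) / 3600
    return -value if hemi in "SW" else value  # south/west are negative

print(dms_to_decimal("10°17′29″S"))  # ≈ -10.2914 (Palmas Airport latitude)
print(dms_to_decimal("51°7′48″W"))   # -51.13    (Londrina Airport longitude)
```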