
How far is Jaguaruna from Manicoré?

The distance between Manicoré (Manicoré Airport) and Jaguaruna (Jaguaruna Regional Airport) is 1764 miles / 2840 kilometers / 1533 nautical miles.

The driving distance from Manicoré (MNX) to Jaguaruna (JJG) is 2581 miles / 4153 kilometers, and travel time by car is about 124 hours 18 minutes.

Manicoré Airport – Jaguaruna Regional Airport

1764 miles / 2840 kilometers / 1533 nautical miles


Distance from Manicoré to Jaguaruna

There are several ways to calculate the distance from Manicoré to Jaguaruna. Here are two standard methods:

Vincenty's formula (applied above)
  • 1764.440 miles
  • 2839.591 kilometers
  • 1533.257 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
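As a minimal sketch, the standard iterative Vincenty inverse solution on the WGS-84 ellipsoid can be written as follows. The coordinates are the airport coordinates listed in the airport information section, converted to decimal degrees; the exact parameters the site uses are not published, so WGS-84 is an assumption here.

```python
import math

def vincenty_inverse(lat1, lon1, lat2, lon2):
    """Geodesic distance in metres on the WGS-84 ellipsoid (assumed datum)."""
    a = 6378137.0                 # semi-major axis (m)
    f = 1 / 298.257223563         # flattening
    b = (1 - f) * a               # semi-minor axis (m)
    L = math.radians(lon2 - lon1)
    # Reduced latitudes
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate until the longitude difference converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                       if cos2_alpha else 0.0)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos2sigma_m + C * cos_sigma * (-1 + 2 * cos2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos2sigma_m ** 2)
        - B / 6 * cos2sigma_m * (-3 + 4 * sin_sigma ** 2)
        * (-3 + 4 * cos2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# MNX (5°48′40″S, 61°16′41″W) and JJG (28°40′31″S, 49°3′34″W) in decimal degrees
d_m = vincenty_inverse(-5.81111, -61.27806, -28.67528, -49.05944)
```

With these inputs the result lands close to the 2839.591 km quoted above.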

Haversine formula
  • 1770.393 miles
  • 2849.172 kilometers
  • 1538.430 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
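The haversine calculation is much simpler; a sketch using a mean Earth radius of 6371 km (an assumption, since the site does not state its radius) and the same decimal-degree coordinates:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, r=6371.0):
    """Great-circle distance in km on a sphere of radius r."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

d_km = haversine_km(-5.81111, -61.27806, -28.67528, -49.05944)
```

This reproduces the 2849.172 km figure above to within rounding; the ~10 km gap to the Vincenty result reflects the spherical versus ellipsoidal Earth model.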

How long does it take to fly from Manicoré to Jaguaruna?

The estimated flight time from Manicoré Airport to Jaguaruna Regional Airport is 3 hours and 50 minutes.
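The site does not publish its flight-time formula, but a common rule of thumb is cruise distance divided by an average speed plus a fixed overhead for takeoff and landing. A sketch with assumed values (~530 mph cruise, 30 minutes overhead) that happens to land on the stated figure:

```python
def flight_time_hours(distance_miles, cruise_mph=530.0, overhead_hours=0.5):
    """Rough flight-time estimate; cruise speed and overhead are assumptions."""
    return distance_miles / cruise_mph + overhead_hours

t = flight_time_hours(1764)
hours, minutes = int(t), round((t - int(t)) * 60)
# → 3 h 50 min
```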

Flight carbon footprint between Manicoré Airport (MNX) and Jaguaruna Regional Airport (JJG)

On average, flying from Manicoré to Jaguaruna generates about 197 kg of CO2 per passenger, which is roughly 434 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.

Map of flight path and driving directions from Manicoré to Jaguaruna

See the map of the shortest flight path between Manicoré Airport (MNX) and Jaguaruna Regional Airport (JJG).

Airport information

Origin: Manicoré Airport
City: Manicoré
Country: Brazil
IATA Code: MNX
ICAO Code: SBMY
Coordinates: 5°48′40″S, 61°16′41″W
Destination: Jaguaruna Regional Airport
City: Jaguaruna
Country: Brazil
IATA Code: JJG
ICAO Code: SBJA
Coordinates: 28°40′31″S, 49°3′34″W
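The distance formulas above take decimal degrees, while the coordinates here are listed in degrees-minutes-seconds. A small helper for the conversion (southern and western hemispheres become negative):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere letter to decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

mnx = (dms_to_decimal(5, 48, 40, "S"), dms_to_decimal(61, 16, 41, "W"))
jjg = (dms_to_decimal(28, 40, 31, "S"), dms_to_decimal(49, 3, 34, "W"))
# mnx ≈ (-5.8111, -61.2781), jjg ≈ (-28.6753, -49.0594)
```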