
How far is Aasiaat from Neerlerit Inaat?

The distance between Neerlerit Inaat (Nerlerit Inaat Airport) and Aasiaat (Aasiaat Airport) is 730 miles / 1174 kilometers / 634 nautical miles.


Distance from Neerlerit Inaat to Aasiaat

There are several ways to calculate the distance from Neerlerit Inaat to Aasiaat. Here are two standard methods:

Vincenty's formula (applied above)
  • 729.559 miles
  • 1174.111 kilometers
  • 633.969 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
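For reference, here is a minimal Python sketch of Vincenty's inverse method. It assumes the WGS-84 ellipsoid (the page does not state which datum it uses) and uses the airport coordinates listed under Airport information below, converted to decimal degrees:

```python
import math

# WGS-84 ellipsoid parameters (an assumption; the page does not name its datum)
A = 6378137.0                # semi-major axis, meters
F = 1 / 298.257223563        # flattening
B = (1 - F) * A              # semi-minor axis, meters

def vincenty(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Distance in meters between two points via Vincenty's inverse formula."""
    U1 = math.atan((1 - F) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - F) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # cos(2 * sigma_m); zero for equatorial lines where cos2_alpha == 0
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
        C = F / 16 * cos2_alpha * (4 + F * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * F * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (A ** 2 - B ** 2) / B ** 2
    big_a = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    big_b = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = big_b * sin_sigma * (cos_2sm + big_b / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - big_b / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return B * big_a * (sigma - delta_sigma)

# CNP and JEG in decimal degrees (converted from the DMS coordinates below)
cnp = (70.743056, -22.650278)
jeg = (68.721667, -52.784444)
meters = vincenty(*cnp, *jeg)
print(f"{meters / 1609.344:.3f} mi, {meters / 1000:.3f} km")
# should land close to the 729.559 mi / 1174.111 km quoted above
```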

Haversine formula
  • 726.616 miles
  • 1169.376 kilometers
  • 631.412 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance, i.e. the shortest path between two points along the surface of the sphere).
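A compact sketch of the haversine calculation, assuming the commonly used mean Earth radius of 6,371 km (the page does not state which radius it uses):

```python
import math

def haversine(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in kilometers on a sphere of the given radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# CNP -> JEG in decimal degrees (converted from the DMS coordinates below)
km = haversine(70.743056, -22.650278, 68.721667, -52.784444)
print(f"{km:.3f} km = {km / 1.609344:.3f} mi")
# should land close to the 1169.376 km / 726.616 mi quoted above
```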

How long does it take to fly from Neerlerit Inaat to Aasiaat?

The estimated flight time from Nerlerit Inaat Airport to Aasiaat Airport is 1 hour and 52 minutes.
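The page does not say how this estimate is derived. A common rule of thumb adds a fixed allowance for taxi, climb, and descent to the time spent at cruise speed; the cruise speed and overhead below are illustrative assumptions, not the site's actual parameters, so the result only roughly matches the 1 hour 52 minutes quoted above:

```python
def estimate_flight_minutes(distance_mi, cruise_mph=500, overhead_min=30):
    """Rough block time: fixed taxi/climb/descent allowance plus cruise time.
    Both constants are assumptions for illustration, not the site's values."""
    return overhead_min + distance_mi / cruise_mph * 60

minutes = estimate_flight_minutes(730)
print(f"{int(minutes // 60)} h {round(minutes % 60)} min")
# about 2 h with these assumed constants; the site quotes 1 h 52 min
```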

What is the time difference between Neerlerit Inaat and Aasiaat?

There is no time difference between Neerlerit Inaat and Aasiaat.

Flight carbon footprint between Nerlerit Inaat Airport (CNP) and Aasiaat Airport (JEG)

On average, flying from Neerlerit Inaat to Aasiaat generates about 128 kg of CO2 per passenger, equivalent to about 282 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
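As a sketch of how such a figure can be reproduced: the 128 kg estimate implies roughly 0.109 kg of CO2 per passenger-kilometer over the 1,174 km route. That emission factor is back-calculated from the numbers above, not a value published by the site:

```python
KG_PER_LB = 0.45359237  # exact definition of the pound

def co2_kg(distance_km, kg_per_pkm=0.109):
    """Per-passenger CO2 from jet-fuel burn; the emission factor is an
    assumption back-calculated from the 128 kg / 1174 km figures above."""
    return distance_km * kg_per_pkm

kg = co2_kg(1174)
print(f"{kg:.0f} kg CO2 = {kg / KG_PER_LB:.0f} lbs")  # 128 kg, 282 lbs
```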

Map of flight path from Neerlerit Inaat to Aasiaat

See the map of the shortest flight path between Nerlerit Inaat Airport (CNP) and Aasiaat Airport (JEG).

Airport information

Origin: Nerlerit Inaat Airport
City: Neerlerit Inaat
Country: Greenland
IATA Code: CNP
ICAO Code: BGCO
Coordinates: 70°44′35″N, 22°39′1″W
Destination: Aasiaat Airport
City: Aasiaat
Country: Greenland
IATA Code: JEG
ICAO Code: BGAA
Coordinates: 68°43′18″N, 52°47′4″W
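The distance formulas above take decimal degrees; a small helper for converting the DMS coordinates listed here (southern latitudes and western longitudes are negative):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds to signed decimal degrees;
    'S' and 'W' hemispheres are negative."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# Nerlerit Inaat Airport (CNP): 70°44′35″N, 22°39′1″W
print(dms_to_decimal(70, 44, 35, "N"), dms_to_decimal(22, 39, 1, "W"))
# Aasiaat Airport (JEG): 68°43′18″N, 52°47′4″W
print(dms_to_decimal(68, 43, 18, "N"), dms_to_decimal(52, 47, 4, "W"))
```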