
How far is Uranium City from Norman Wells?

The distance between Norman Wells (Norman Wells Airport) and Uranium City (Uranium City Airport) is 705 miles / 1135 kilometers / 613 nautical miles.


Distance from Norman Wells to Uranium City

There are several ways to calculate the distance from Norman Wells to Uranium City. Here are two standard methods:

Vincenty's formula (applied above)
  • 705.030 miles
  • 1134.636 kilometers
  • 612.654 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
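As a sketch of how that ellipsoidal figure is obtained, the standard inverse Vincenty iteration on the WGS-84 ellipsoid can be implemented with nothing but the standard library. The coordinates are taken from the airport table below; the function name is illustrative:

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Inverse Vincenty distance on the WGS-84 ellipsoid, in kilometers."""
    a = 6378137.0              # WGS-84 semi-major axis (metres)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a            # semi-minor axis (metres)

    # Reduced latitudes and longitude difference
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0                          # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # cos(2 * sigma_m); cos2_alpha == 0 only on equatorial lines
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                   if cos2_alpha else 0.0)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (2 * cos_2sm ** 2 - 1)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sm + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sm ** 2)
            - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0   # metres -> km

# YVQ (65°16'53"N, 126°47'52"W) to YBE (59°33'41"N, 108°28'51"W)
km = vincenty_km(65.281389, -126.797778, 59.561389, -108.480833)
print(f"{km:.3f} km / {km * 0.621371:.3f} mi")  # should be close to the figures above
```

The iteration usually converges in a handful of steps except for nearly antipodal points, which do not arise on this route.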

Haversine formula
  • 702.712 miles
  • 1130.905 kilometers
  • 610.640 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest path between two points along the surface of the sphere).
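The spherical calculation is much simpler. A minimal sketch, using the IUGG mean Earth radius of 6371.0088 km (the radius this site uses is an assumption) and the airport coordinates from the table below:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0088):
    """Great-circle distance on a sphere (IUGG mean Earth radius by default)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)          # difference in latitude
    dlam = math.radians(lon2 - lon1)          # difference in longitude
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h))

# YVQ (65°16'53"N, 126°47'52"W) to YBE (59°33'41"N, 108°28'51"W)
km = haversine_km(65.281389, -126.797778, 59.561389, -108.480833)
print(f"{km:.1f} km ({km * 0.621371:.1f} mi)")
```

The spherical result comes out a few kilometres shorter than Vincenty's on this route, because the sphere slightly underestimates distances at these latitudes on the flattened ellipsoid.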

How long does it take to fly from Norman Wells to Uranium City?

The estimated flight time from Norman Wells Airport to Uranium City Airport is 1 hour and 50 minutes.

What is the time difference between Norman Wells and Uranium City?

There is no time difference between Norman Wells and Uranium City.

Flight carbon footprint between Norman Wells Airport (YVQ) and Uranium City Airport (YBE)

On average, flying from Norman Wells to Uranium City generates about 125 kg of CO2 per passenger (roughly 276 pounds). These figures are estimates and include only the CO2 generated by burning jet fuel.

Map of flight path from Norman Wells to Uranium City

See the map of the shortest flight path between Norman Wells Airport (YVQ) and Uranium City Airport (YBE).

Airport information

Origin Norman Wells Airport
City: Norman Wells
Country: Canada
IATA Code: YVQ
ICAO Code: CYVQ
Coordinates: 65°16′53″N, 126°47′52″W
Destination Uranium City Airport
City: Uranium City
Country: Canada
IATA Code: YBE
ICAO Code: CYBE
Coordinates: 59°33′41″N, 108°28′51″W