
How far is Prince George from Hopedale?

The distance between Hopedale (Hopedale Airport) and Prince George (Prince George Airport) is 2421 miles / 3896 kilometers / 2103 nautical miles.

The driving distance from Hopedale (YHO) to Prince George (YXS) is 4199 miles / 6757 kilometers, and travel time by car is about 106 hours 57 minutes.

Hopedale Airport – Prince George Airport

2421 miles / 3896 kilometers / 2103 nautical miles


Distance from Hopedale to Prince George

There are several ways to calculate the distance from Hopedale to Prince George. Here are two standard methods:

Vincenty's formula (applied above)
  • 2420.647 miles
  • 3895.653 kilometers
  • 2103.485 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
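For readers who want to reproduce the figure, below is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid, using the airport coordinates from the table at the end of this page. The function name and convergence settings are illustrative choices, not part of the calculator itself.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Inverse Vincenty distance (statute miles) on the WGS-84 ellipsoid."""
    a = 6378137.0             # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563     # WGS-84 flattening
    b = (1 - f) * a           # semi-minor axis (m)

    # Reduced latitudes and longitude difference
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):                      # iterate lambda to convergence
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0                             # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos_sq_alpha = 1 - sin_alpha ** 2
        if cos_sq_alpha != 0:
            cos2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos_sq_alpha
        else:
            cos2sigma_m = 0.0                      # both points on the equator
        C = f / 16 * cos_sq_alpha * (4 + f * (4 - 3 * cos_sq_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos2sigma_m + C * cos_sigma *
                                     (-1 + 2 * cos2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    # Final distance along the ellipsoid
    u_sq = cos_sq_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
    B = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
    delta_sigma = B * sin_sigma * (cos2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos2sigma_m ** 2) -
        B / 6 * cos2sigma_m * (-3 + 4 * sin_sigma ** 2) *
        (-3 + 4 * cos2sigma_m ** 2)))
    meters = b * A * (sigma - delta_sigma)
    return meters / 1609.344                       # metres -> statute miles

# YHO (55°26′53″N, 60°13′42″W) to YXS (53°53′21″N, 122°40′44″W)
print(round(vincenty_miles(55.4481, -60.2283, 53.8892, -122.6789), 1))
# should land near the 2420.6-mile figure quoted above
```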

Haversine formula
  • 2412.567 miles
  • 3882.650 kilometers
  • 2096.464 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
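A similar sketch of the haversine formula, assuming the commonly used mean Earth radius of 6371 km; the small gap from the numbers above comes down to which radius is chosen.

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance (statute miles) on a sphere of the given radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2 +
         math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    c = 2 * math.atan2(math.sqrt(a), math.sqrt(1 - a))
    return radius_km * c / 1.609344                # km -> statute miles

print(round(haversine_miles(55.4481, -60.2283, 53.8892, -122.6789), 1))
# ≈ 2413 miles with a 6371 km radius
```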

How long does it take to fly from Hopedale to Prince George?

The estimated flight time from Hopedale Airport to Prince George Airport is 5 hours and 4 minutes.
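Estimates like this are essentially the great-circle distance divided by an average block speed (cruise speed discounted for taxi, climb and descent). A minimal sketch, where the block speed is an assumption chosen for illustration rather than a figure from the calculator:

```python
distance_miles = 2421
avg_block_speed_mph = 478      # assumed average block speed, for illustration only

hours = distance_miles / avg_block_speed_mph
h, m = int(hours), round(hours % 1 * 60)
print(f"about {h} h {m} min")  # about 5 h 4 min with this assumed speed
```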

Flight carbon footprint between Hopedale Airport (YHO) and Prince George Airport (YXS)

On average, flying from Hopedale to Prince George generates about 266 kg (586 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
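The pound figure is simply a unit conversion of the same estimate:

```python
co2_kg = 266
co2_lbs = co2_kg * 2.20462     # 1 kg ≈ 2.20462 lb
print(round(co2_lbs))          # ≈ 586
```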

Map of flight path and driving directions from Hopedale to Prince George

See the map of the shortest flight path between Hopedale Airport (YHO) and Prince George Airport (YXS).

Airport information

Origin: Hopedale Airport
City: Hopedale
Country: Canada
IATA Code: YHO
ICAO Code: CYHO
Coordinates: 55°26′53″N, 60°13′42″W
Destination: Prince George Airport
City: Prince George
Country: Canada
IATA Code: YXS
ICAO Code: CYXS
Coordinates: 53°53′21″N, 122°40′44″W