How far is St. Lewis from Poplar Hill?

The distance between Poplar Hill (Poplar Hill Airport) and St. Lewis (St. Lewis (Fox Harbour) Airport) is 1618 miles / 2604 kilometers / 1406 nautical miles.

The driving distance from Poplar Hill (YHP) to St. Lewis (YFX) is 2648 miles / 4261 kilometers, and travel time by car is about 66 hours 22 minutes.

Poplar Hill Airport – St. Lewis (Fox Harbour) Airport

Distance: 1618 miles / 2604 kilometers / 1406 nautical miles
Flight time: 3 h 33 min
Time difference: 2 h 30 min
CO2 emission: 187 kg

Distance from Poplar Hill to St. Lewis

There are several ways to calculate the distance from Poplar Hill to St. Lewis. Here are two standard methods:

Vincenty's formula (applied above)
  • 1618.017 miles
  • 2603.945 kilometers
  • 1406.018 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
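
For a quick cross-check, the ellipsoidal figure can be reproduced with the geopy library (its geodesic function works on the WGS-84 ellipsoid using Karney's algorithm rather than Vincenty's iteration, but it serves the same purpose and agrees closely for a route like this one). The decimal coordinates are conversions of the DMS values listed under Airport information; this is an illustrative sketch, not the calculator's own code.

    # Ellipsoidal (WGS-84) distance with geopy -- illustrative sketch only
    from geopy.distance import geodesic

    poplar_hill = (52.1131, -94.2556)   # YHP: 52°6′47″N, 94°15′20″W
    st_lewis = (52.3728, -55.6739)      # YFX: 52°22′22″N, 55°40′26″W

    d = geodesic(poplar_hill, st_lewis)
    print(f"{d.miles:.1f} mi / {d.km:.1f} km / {d.nautical:.1f} NM")
    # Expect roughly 1618 mi / 2604 km / 1406 NM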

Haversine formula
  • 1612.825 miles
  • 2595.589 kilometers
  • 1401.506 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
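
A minimal haversine implementation using only the standard library reproduces the spherical figures above; the 6371 km mean Earth radius is an assumed value, and the decimal coordinates are conversions of the DMS values listed under Airport information.

    import math

    def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
        """Great-circle distance between two lat/lon points on a sphere."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlmb = math.radians(lon2 - lon1)
        a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
        return 2 * radius_km * math.asin(math.sqrt(a))

    km = haversine_km(52.1131, -94.2556, 52.3728, -55.6739)
    print(f"{km:.1f} km / {km * 0.621371:.1f} mi / {km / 1.852:.1f} NM")
    # Roughly 2596 km / 1613 mi / 1402 NM, in line with the figures above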

How long does it take to fly from Poplar Hill to St. Lewis?

The estimated flight time from Poplar Hill Airport to St. Lewis (Fox Harbour) Airport is 3 hours and 33 minutes.
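
The calculator does not publish the parameters behind this figure, but a back-of-the-envelope estimate (assuming an average speed of about 500 mph plus roughly 20 minutes for taxi, climb and descent) lands within a minute of it:

    # Rough flight-time estimate -- assumed speed and allowance, not the site's formula
    distance_miles = 1618
    cruise_mph = 500        # assumed average block speed
    allowance_min = 20      # assumed taxi/climb/descent allowance

    total_min = round(distance_miles / cruise_mph * 60 + allowance_min)
    print(f"{total_min // 60} h {total_min % 60} min")   # about 3 h 34 min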

Flight carbon footprint between Poplar Hill Airport (YHP) and St. Lewis (Fox Harbour) Airport (YFX)

On average, flying from Poplar Hill to St. Lewis generates about 187 kg (412 lbs) of CO2 per passenger. This figure is an estimate and includes only the CO2 generated by burning jet fuel.
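
Both the pound conversion and the emission rate implied by the estimate are easy to check; the per-mile figure below is simply 187 kg divided by the flight distance, not a published emission factor.

    co2_kg = 187
    distance_miles = 1618

    print(f"{co2_kg * 2.20462:.0f} lbs")                               # about 412 lbs
    print(f"{co2_kg / distance_miles:.3f} kg CO2 per passenger-mile")  # about 0.116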

Map of flight path and driving directions from Poplar Hill to St. Lewis

See the map of the shortest flight path between Poplar Hill Airport (YHP) and St. Lewis (Fox Harbour) Airport (YFX).

Airport information

Origin Poplar Hill Airport
City: Poplar Hill
Country: Canada
IATA Code: YHP
ICAO Code: CPV7
Coordinates: 52°6′47″N, 94°15′20″W
Destination St. Lewis (Fox Harbour) Airport
City: St. Lewis
Country: Canada
IATA Code: YFX
ICAO Code: CCK4
Coordinates: 52°22′22″N, 55°40′26″W
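
The coordinates above are given in degrees, minutes and seconds; the distance examples earlier on this page use their decimal-degree equivalents, which can be derived with a small helper like this (written for illustration, not taken from any particular library):

    def dms_to_decimal(degrees, minutes, seconds, hemisphere):
        """Convert degrees/minutes/seconds and an N/S/E/W hemisphere to signed decimal degrees."""
        sign = -1 if hemisphere in ("S", "W") else 1
        return sign * (degrees + minutes / 60 + seconds / 3600)

    # Poplar Hill Airport (YHP): 52°6′47″N, 94°15′20″W
    print(round(dms_to_decimal(52, 6, 47, "N"), 4), round(dms_to_decimal(94, 15, 20, "W"), 4))
    # 52.1131 -94.2556

    # St. Lewis (Fox Harbour) Airport (YFX): 52°22′22″N, 55°40′26″W
    print(round(dms_to_decimal(52, 22, 22, "N"), 4), round(dms_to_decimal(55, 40, 26, "W"), 4))
    # 52.3728 -55.6739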