
How far is St. Lewis from Charlo?

The distance between Charlo (Charlo Airport) and St. Lewis (St. Lewis (Fox Harbour) Airport) is 561 miles / 902 kilometers / 487 nautical miles.

The driving distance from Charlo (YCL) to St. Lewis (YFX) is 1195 miles / 1923 kilometers, and travel time by car is about 30 hours 44 minutes.

Charlo Airport – St. Lewis (Fox Harbour) Airport

Distance: 561 miles / 902 kilometers / 487 nautical miles
Flight time: 1 h 33 min
CO2 emission: 108 kg

Distance from Charlo to St. Lewis

There are several ways to calculate the distance from Charlo to St. Lewis. Here are two standard methods:

Vincenty's formula (applied above)
  • 560.756 miles
  • 902.449 kilometers
  • 487.283 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
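The page doesn't publish its calculation code, but the ellipsoidal figure can be reproduced with a geodesic library. Here is a minimal sketch using the geographiclib package, which solves the same WGS84 inverse problem (via Karney's algorithm rather than Vincenty's iteration, so it agrees with the values above to well within the rounding shown):

    from geographiclib.geodesic import Geodesic

    # Airport coordinates in decimal degrees (see the airport information below)
    ycl = (47.9906, -66.3303)   # Charlo Airport (YCL)
    yfx = (52.3728, -55.6739)   # St. Lewis (Fox Harbour) Airport (YFX)

    # Solve the inverse geodesic problem on the WGS84 ellipsoid; s12 is the distance in metres
    g = Geodesic.WGS84.Inverse(ycl[0], ycl[1], yfx[0], yfx[1])
    print(g["s12"] / 1609.344)  # ≈ 561 statute miles
    print(g["s12"] / 1000)      # ≈ 902 kilometers
    print(g["s12"] / 1852)      # ≈ 487 nautical miles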

Haversine formula
  • 559.474 miles
  • 900.386 kilometers
  • 486.170 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
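The haversine formula is short enough to write out directly. A minimal sketch in Python, using the mean Earth radius of 6,371 km (the radius is an assumption; other conventions shift the result slightly):

    from math import radians, sin, cos, asin, sqrt

    def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
        """Great-circle distance between two lat/lon points on a spherical Earth."""
        lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
        a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
        return 2 * radius_km * asin(sqrt(a))

    # Charlo (YCL) to St. Lewis (YFX), coordinates from the airport information below
    print(haversine_km(47.9906, -66.3303, 52.3728, -55.6739))  # ≈ 900.4 km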

How long does it take to fly from Charlo to St. Lewis?

The estimated flight time from Charlo Airport to St. Lewis (Fox Harbour) Airport is 1 hour and 33 minutes.
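The assumptions behind the 1 hour 33 minute figure aren't stated on the page. A common rule of thumb adds a fixed allowance for taxi, climb and descent to cruise time at an assumed average speed; here is a sketch with illustrative parameters (it won't reproduce the figure above exactly):

    def estimate_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
        """Rough flight-time estimate: fixed taxi/climb/descent allowance plus cruise time."""
        total_min = overhead_min + distance_miles / cruise_mph * 60
        hours, minutes = divmod(round(total_min), 60)
        return f"{hours} h {minutes} min"

    print(estimate_flight_time(560.756))  # "1 h 37 min" with these assumed parameters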

Flight carbon footprint between Charlo Airport (YCL) and St. Lewis (Fox Harbour) Airport (YFX)

On average, flying from Charlo to St. Lewis generates about 108 kg of CO2 per passenger (roughly 237 pounds). These figures are estimates and include only the CO2 generated by burning jet fuel.
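The pound figure is a straight unit conversion from the kilogram estimate (one kilogram is about 2.20462 pounds); converting the rounded 108 kg rather than the unrounded value can shift the result by a pound or so:

    KG_TO_LB = 2.20462  # pounds per kilogram

    co2_kg = 108
    print(f"{co2_kg} kg ≈ {co2_kg * KG_TO_LB:.0f} lb")  # 108 kg ≈ 238 lb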

Map of flight path and driving directions from Charlo to St. Lewis

See the map of the shortest flight path between Charlo Airport (YCL) and St. Lewis (Fox Harbour) Airport (YFX).

Airport information

Origin: Charlo Airport
City: Charlo
Country: Canada
IATA Code: YCL
ICAO Code: CYCL
Coordinates: 47°59′26″N, 66°19′49″W
Destination: St. Lewis (Fox Harbour) Airport
City: St. Lewis
Country: Canada
IATA Code: YFX
ICAO Code: CCK4
Coordinates: 52°22′22″N, 55°40′26″W
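The coordinates above are given in degrees, minutes and seconds, while the distance formulas earlier expect decimal degrees. A small helper for the conversion (the function name is hypothetical):

    def dms_to_decimal(degrees, minutes, seconds, hemisphere):
        """Convert degrees/minutes/seconds plus a hemisphere letter to signed decimal degrees."""
        sign = -1 if hemisphere in ("S", "W") else 1
        return sign * (degrees + minutes / 60 + seconds / 3600)

    # Charlo Airport (YCL): 47°59′26″N, 66°19′49″W
    ycl = (dms_to_decimal(47, 59, 26, "N"), dms_to_decimal(66, 19, 49, "W"))
    # St. Lewis (Fox Harbour) Airport (YFX): 52°22′22″N, 55°40′26″W
    yfx = (dms_to_decimal(52, 22, 22, "N"), dms_to_decimal(55, 40, 26, "W"))
    print(ycl)  # ≈ (47.9906, -66.3303)
    print(yfx)  # ≈ (52.3728, -55.6739)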