How far is Fort St. John from Charlo?

The distance between Charlo (Charlo Airport) and Fort St. John (Fort St. John Airport) is 2320 miles / 3733 kilometers / 2016 nautical miles.

The driving distance from Charlo (YCL) to Fort St. John (YXJ) is 3114 miles / 5012 kilometers, and travel time by car is about 63 hours 48 minutes.

Charlo Airport – Fort St. John Airport

2320 Miles
3733 Kilometers
2016 Nautical miles

Distance from Charlo to Fort St. John

There are several ways to calculate the distance from Charlo to Fort St. John. Here are two standard methods:

Vincenty's formula (applied above)
  • 2319.527 miles
  • 3732.917 kilometers
  • 2015.614 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
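
To reproduce the ellipsoidal figure, the sketch below uses pyproj's Geod class, which solves the same WGS-84 inverse geodesic problem that Vincenty's formula addresses (pyproj uses Karney's algorithm internally, which agrees with Vincenty's result to well under a metre). The coordinates come from the airport information section below.

```python
# Sketch: ellipsoidal (WGS-84) distance with pyproj, comparable to Vincenty's formula.
from pyproj import Geod

# Airport coordinates in decimal degrees (from the airport information below)
CHARLO = (47.990556, -66.330278)          # YCL: 47°59′26″N, 66°19′49″W
FORT_ST_JOHN = (56.238056, -120.739722)   # YXJ: 56°14′17″N, 120°44′23″W

geod = Geod(ellps="WGS84")
# Geod.inv expects lon/lat order and returns forward azimuth, back azimuth, distance (metres)
_, _, meters = geod.inv(CHARLO[1], CHARLO[0], FORT_ST_JOHN[1], FORT_ST_JOHN[0])

print(f"{meters / 1609.344:.1f} mi")   # ≈ 2319.5 mi
print(f"{meters / 1000:.1f} km")       # ≈ 3732.9 km
print(f"{meters / 1852:.1f} NM")       # ≈ 2015.6 NM
```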

Haversine formula
  • 2312.479 miles
  • 3721.574 kilometers
  • 2009.489 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
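
The haversine calculation is simple to reproduce directly. The sketch below assumes a mean earth radius of 6,371 km; the article's exact figures may use a slightly different radius, so results can differ by a fraction of a percent.

```python
# Sketch: great-circle (haversine) distance, treating the earth as a sphere.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points, in kilometres."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
    return 2 * radius_km * asin(sqrt(a))

km = haversine_km(47.990556, -66.330278, 56.238056, -120.739722)
print(f"{km:.1f} km / {km / 1.609344:.1f} mi / {km / 1.852:.1f} NM")  # ≈ 3722 km / 2313 mi / 2009 NM
```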

How long does it take to fly from Charlo to Fort St. John?

The estimated flight time from Charlo Airport to Fort St. John Airport is 4 hours and 53 minutes.
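
The calculator does not document how it derives this figure. As a rough sanity check, dividing the straight-line distance by an assumed average block speed of about 475 mph (an illustrative assumption, not the site's stated method) lands on roughly the same number:

```python
# Rough sanity check of the quoted flight time; the 475 mph average block speed is an assumption.
distance_mi = 2320
avg_block_speed_mph = 475   # assumed average including climb and descent

hours = distance_mi / avg_block_speed_mph
h, m = int(hours), round(hours % 1 * 60)
print(f"~{h} h {m} min")    # ~4 h 53 min
```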

Flight carbon footprint between Charlo Airport (YCL) and Fort St. John Airport (YXJ)

On average, flying from Charlo to Fort St. John generates about 254 kg of CO2 per passenger, which is roughly 560 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
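
The per-passenger estimate itself comes from the calculator; the sketch below only recomputes the unit conversion and the per-kilometre intensity implied by the article's own numbers:

```python
# Unit conversion and implied intensity, derived from the figures quoted above.
co2_kg = 254        # per-passenger CO2 estimate
distance_km = 3733  # great-circle distance in km

print(f"{co2_kg * 2.20462:.0f} lbs")                                # ≈ 560 lbs (1 kg ≈ 2.20462 lbs)
print(f"{co2_kg / distance_km * 1000:.0f} g CO2 per passenger-km")  # ≈ 68 g per passenger-km
```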

Map of flight path and driving directions from Charlo to Fort St. John

See the map of the shortest flight path between Charlo Airport (YCL) and Fort St. John Airport (YXJ).

Airport information

Origin Charlo Airport
City: Charlo
Country: Canada
IATA Code: YCL
ICAO Code: CYCL
Coordinates: 47°59′26″N, 66°19′49″W
Destination Fort St. John Airport
City: Fort St. John
Country: Canada
IATA Code: YXJ
ICAO Code: CYXJ
Coordinates: 56°14′17″N, 120°44′23″W