
How far is Fort St. John from Nakina?

The distance between Nakina (Nakina Airport) and Fort St. John (Fort St. John Airport) is 1457 miles / 2345 kilometers / 1266 nautical miles.

The driving distance from Nakina (YQN) to Fort St. John (YXJ) is 1880 miles / 3026 kilometers, and the travel time by car is about 38 hours 27 minutes.

Nakina Airport – Fort St. John Airport

1457 miles / 2345 kilometers / 1266 nautical miles


Distance from Nakina to Fort St. John

There are several ways to calculate the distance from Nakina to Fort St. John. Here are two standard methods:

Vincenty's formula (applied above)
  • 1457.086 miles
  • 2344.953 kilometers
  • 1266.173 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
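For reference, the iterative inverse Vincenty computation fits in a few dozen lines of Python. The sketch below assumes the WGS-84 ellipsoid and uses the airport coordinates listed at the bottom of this page; it is a minimal illustration, not necessarily the exact code used by this site.

    import math

    def vincenty_miles(lat1, lon1, lat2, lon2):
        """Inverse Vincenty distance on the WGS-84 ellipsoid, in miles."""
        a = 6378137.0             # semi-major axis (m)
        f = 1 / 298.257223563     # flattening
        b = (1 - f) * a           # semi-minor axis (m)

        U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
        U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
        L = math.radians(lon2 - lon1)
        lam = L

        for _ in range(200):      # iterate until the longitude converges
            sin_sigma = math.hypot(
                math.cos(U2) * math.sin(lam),
                math.cos(U1) * math.sin(U2)
                - math.sin(U1) * math.cos(U2) * math.cos(lam))
            cos_sigma = (math.sin(U1) * math.sin(U2)
                         + math.cos(U1) * math.cos(U2) * math.cos(lam))
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = math.cos(U1) * math.cos(U2) * math.sin(lam) / sin_sigma
            cos2_alpha = 1 - sin_alpha ** 2
            cos_2sm = cos_sigma - 2 * math.sin(U1) * math.sin(U2) / cos2_alpha
            C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
            lam_prev = lam
            lam = L + (1 - C) * f * sin_alpha * (
                sigma + C * sin_sigma * (
                    cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
            if abs(lam - lam_prev) < 1e-12:
                break

        u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        delta_sigma = B * sin_sigma * (
            cos_2sm + B / 4 * (
                cos_sigma * (-1 + 2 * cos_2sm ** 2)
                - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
                * (-3 + 4 * cos_2sm ** 2)))
        meters = b * A * (sigma - delta_sigma)
        return meters / 1609.344  # meters -> statute miles

    # YQN (50°10′58″N, 86°41′47″W) to YXJ (56°14′17″N, 120°44′23″W)
    print(vincenty_miles(50.182778, -86.696389, 56.238056, -120.739722))
    # roughly 1457.09 miles, matching the figure above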

Haversine formula
  • 1452.627 miles
  • 2337.777 kilometers
  • 1262.299 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
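The haversine computation itself is only a few lines. The sketch below assumes a mean Earth radius of 6371 km and the same airport coordinates:

    import math

    def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
        """Great-circle distance on a sphere, in kilometers."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        a = (math.sin(dphi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
        return radius_km * 2 * math.asin(math.sqrt(a))

    print(haversine_km(50.182778, -86.696389, 56.238056, -120.739722))
    # roughly 2337.8 km, matching the haversine figure above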

How long does it take to fly from Nakina to Fort St. John?

The estimated flight time from Nakina Airport to Fort St. John Airport is 3 hours and 15 minutes.
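The page does not state how this estimate is derived. A common rule of thumb divides the great-circle distance by an assumed average cruise speed of about 500 mph and adds roughly 30 minutes for takeoff and landing; that heuristic lands within about ten minutes of the figure above:

    # Hedged rule of thumb, not necessarily this site's exact formula:
    distance_miles = 1457
    cruise_mph = 500          # assumed average cruise speed
    overhead_hours = 0.5      # assumed allowance for climb and descent
    hours = distance_miles / cruise_mph + overhead_hours
    print(f"{int(hours)} h {round(hours % 1 * 60)} min")   # about 3 h 25 min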

Flight carbon footprint between Nakina Airport (YQN) and Fort St. John Airport (YXJ)

On average, flying from Nakina to Fort St. John generates about 177 kg (390 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
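Working backward from the page's own numbers, the implied emission factor is about 177 kg / 2345 km ≈ 0.075 kg of CO2 per passenger-kilometer:

    # Per-passenger emission factor implied by the figures above:
    co2_kg = 177
    distance_km = 2345
    print(round(co2_kg / distance_km, 3))   # ~0.075 kg CO2 per passenger-km
    print(round(co2_kg * 2.20462))          # 177 kg -> about 390 lb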

Map of flight path and driving directions from Nakina to Fort St. John

See the map of the shortest flight path between Nakina Airport (YQN) and Fort St. John Airport (YXJ).

Airport information

Origin: Nakina Airport
City: Nakina
Country: Canada
IATA Code: YQN
ICAO Code: CYQN
Coordinates: 50°10′58″N, 86°41′47″W

Destination: Fort St. John Airport
City: Fort St. John
Country: Canada
IATA Code: YXJ
ICAO Code: CYXJ
Coordinates: 56°14′17″N, 120°44′23″W
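To use these coordinates in the formulas above, convert degrees/minutes/seconds to signed decimal degrees (negative for west longitudes and south latitudes); a minimal sketch:

    def dms_to_decimal(deg, minutes, seconds, hemisphere):
        """Convert degrees/minutes/seconds to signed decimal degrees."""
        sign = -1 if hemisphere in ("S", "W") else 1
        return sign * (deg + minutes / 60 + seconds / 3600)

    print(dms_to_decimal(50, 10, 58, "N"))   # YQN latitude  ~  50.182778
    print(dms_to_decimal(86, 41, 47, "W"))   # YQN longitude ~ -86.696389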