How far is St. Lewis from St. John's?

The distance between St. John's (St. John's International Airport) and St. Lewis (St. Lewis (Fox Harbour) Airport) is 353 miles / 569 kilometers / 307 nautical miles.

The driving distance from St. John's (YYT) to St. Lewis (YFX) is 744 miles / 1197 kilometers, and travel time by car is about 18 hours 33 minutes.

Distance from St. John's to St. Lewis

There are several ways to calculate the distance from St. John's to St. Lewis. Here are two standard methods:

Vincenty's formula (applied above)
  • 353.358 miles
  • 568.675 kilometers
  • 307.060 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
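
For readers who want to reproduce the figure, here is a minimal Python sketch of Vincenty's inverse method. It assumes the WGS-84 ellipsoid (the page does not say which ellipsoid it uses) and takes the airport coordinates from the airport information section below.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Distance in metres between two points, assuming the WGS-84 ellipsoid."""
    a = 6378137.0                  # semi-major axis (m)
    f = 1 / 298.257223563          # flattening
    b = (1 - f) * a                # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    # Reduced latitudes
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L                        # first approximation of longitude difference
    for _ in range(200):           # iterate until convergence
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.hypot(cosU2 * sinLam,
                              cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        if sinSigma == 0:
            return 0.0             # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = (cosSigma - 2 * sinU1 * sinU2 / cos2Alpha) if cos2Alpha else 0.0
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (sigma + C * sinSigma *
              (cos2SigmaM + C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < 1e-12:
            break

    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    deltaSigma = B * sinSigma * (cos2SigmaM + B / 4 *
                 (cosSigma * (-1 + 2 * cos2SigmaM ** 2) -
                  B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2) *
                  (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - deltaSigma)

# YYT (47°37′6″N, 52°45′6″W) and YFX (52°22′22″N, 55°40′26″W) in decimal degrees
d = vincenty_distance(47.6183, -52.7517, 52.3728, -55.6739)
print(f"{d / 1609.344:.3f} mi  {d / 1000:.3f} km  {d / 1852:.3f} NM")
# ≈ 353.36 mi  568.68 km  307.06 NM
```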

Haversine formula
  • 353.119 miles
  • 568.289 kilometers
  • 306.852 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth, giving the great-circle distance – the shortest path between the two points along the surface.
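
The haversine calculation is short enough to show in full; this sketch assumes the commonly used mean earth radius of 6,371 km (the page does not state which radius it uses):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, r=6371.0):
    """Great-circle distance in kilometres on a sphere of radius r."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2 +
         math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# Same airport coordinates as above
print(haversine_km(47.6183, -52.7517, 52.3728, -55.6739))  # ≈ 568.3 km
```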

How long does it take to fly from St. John's to St. Lewis?

The estimated flight time from St. John's International Airport to St. Lewis (Fox Harbour) Airport is 1 hour and 10 minutes.
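
The page does not explain how this estimate is derived. A common rule of thumb for estimates like this is a fixed allowance for taxi, climb, and descent plus cruise at a typical jet speed; with an assumed 30-minute allowance and a 500 mph cruise speed, the result lands within a couple of minutes of the figure above:

```python
def flight_time_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    # Assumed rule of thumb: fixed overhead plus time spent at cruise speed
    return overhead_min + distance_miles / cruise_mph * 60

minutes = flight_time_minutes(353.358)
print(f"{int(minutes // 60)} h {round(minutes % 60)} min")  # ≈ 1 h 12 min
```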

What is the time difference between St. John's and St. Lewis?

There is no time difference between St. John's and St. Lewis.

Flight carbon footprint between St. John's International Airport (YYT) and St. Lewis (Fox Harbour) Airport (YFX)

On average, flying from St. John's to St. Lewis generates about 77 kg of CO2 per passenger, which is equal to 170 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
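
The emission factor behind this estimate is not published; the quoted figure works out to roughly 0.135 kg of CO2 per passenger-kilometre, so a sketch with that assumed factor reproduces it:

```python
KG_CO2_PER_PAX_KM = 0.135   # assumed factor implied by the figures above
LBS_PER_KG = 2.20462

co2_kg = round(KG_CO2_PER_PAX_KM * 568.675)   # flight distance in km, from above
co2_lbs = round(co2_kg * LBS_PER_KG)
print(co2_kg, co2_lbs)                        # 77 kg, 170 lbs
```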

Map of flight path and driving directions from St. John's to St. Lewis

See the map of the shortest flight path and the driving route between St. John's International Airport (YYT) and St. Lewis (Fox Harbour) Airport (YFX).

Airport information

Origin: St. John's International Airport
City: St. John's
Country: Canada
IATA Code: YYT
ICAO Code: CYYT
Coordinates: 47°37′6″N, 52°45′6″W
Destination: St. Lewis (Fox Harbour) Airport
City: St. Lewis
Country: Canada
IATA Code: YFX
ICAO Code: CCK4
Coordinates: 52°22′22″N, 55°40′26″W