
How far is St. Lewis from Oxford House?

The distance between Oxford House (Oxford House Airport) and St. Lewis (St. Lewis (Fox Harbour) Airport) is 1615 miles / 2599 kilometers / 1403 nautical miles.

Oxford House Airport – St. Lewis (Fox Harbour) Airport

Distance: 1615 miles / 2599 kilometers / 1403 nautical miles
Flight time: 3 h 33 min
Time difference: 2 h 30 min
CO2 emission: 187 kg


Distance from Oxford House to St. Lewis

There are several ways to calculate the distance from Oxford House to St. Lewis. Here are two standard methods:

Vincenty's formula (applied above)
  • 1614.735 miles
  • 2598.665 kilometers
  • 1403.167 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
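A minimal Python sketch of Vincenty's inverse formula illustrates the ellipsoidal calculation. The WGS-84 constants and the convergence tolerance below are assumptions; the calculator does not state exactly which ellipsoid or stopping criterion it uses.

```python
import math

# WGS-84 ellipsoid parameters (assumed; the calculator's exact model is not stated)
A_AXIS = 6378137.0            # semi-major axis, meters
F = 1 / 298.257223563         # flattening
B_AXIS = (1 - F) * A_AXIS     # semi-minor axis, meters

def vincenty_km(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Inverse Vincenty: geodesic distance in kilometers between two lat/lon points."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - F) * math.tan(phi1))   # reduced latitudes
    U2 = math.atan((1 - F) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = F / 16 * cos2_alpha * (4 + F * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * F * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:   # converged
            break

    u2 = cos2_alpha * (A_AXIS ** 2 - B_AXIS ** 2) / B_AXIS ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sigma_m ** 2)))
    return B_AXIS * A * (sigma - delta_sigma) / 1000.0

# YOH (54°55′59″N, 95°16′44″W) to YFX (52°22′22″N, 55°40′26″W)
yoh = (54 + 55/60 + 59/3600, -(95 + 16/60 + 44/3600))
yfx = (52 + 22/60 + 22/3600, -(55 + 40/60 + 26/3600))
print(round(vincenty_km(*yoh, *yfx), 1))  # ≈ 2598.7 km, the Vincenty figure above
```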

Haversine formula
  • 1609.473 miles
  • 2590.195 kilometers
  • 1398.594 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
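The spherical calculation can be sketched in a few lines of Python. The mean Earth radius of 6371 km is an assumption; the calculator does not state which radius it uses.

```python
import math

EARTH_RADIUS_KM = 6371.0  # mean Earth radius (assumed)

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers between two lat/lon points."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

def dms(deg, minutes, seconds, west_or_south=False):
    """Convert degrees/minutes/seconds to decimal degrees (negative for W/S)."""
    value = deg + minutes / 60 + seconds / 3600
    return -value if west_or_south else value

# YOH: 54°55′59″N, 95°16′44″W    YFX: 52°22′22″N, 55°40′26″W
d = haversine_km(dms(54, 55, 59), dms(95, 16, 44, west_or_south=True),
                 dms(52, 22, 22), dms(55, 40, 26, west_or_south=True))
print(round(d, 1))  # ≈ 2590 km, the haversine figure above
```

Note that the spherical result is about 8 km shorter than the ellipsoidal one, which is why the page reports the Vincenty figure as its headline distance.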

How long does it take to fly from Oxford House to St. Lewis?

The estimated flight time from Oxford House Airport to St. Lewis (Fox Harbour) Airport is 3 hours and 33 minutes.
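The site does not publish its flight-time model. A common rule of thumb is a fixed taxi/climb/descent overhead plus cruise time at a typical airliner speed; the 500 mph cruise and 30-minute overhead below are assumptions, and this heuristic lands in the same ballpark as the figure above rather than reproducing it exactly.

```python
def flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough flight-time estimate: fixed overhead plus cruise time.
    Cruise speed and overhead are assumptions, not the calculator's actual model."""
    total_min = round(overhead_min + distance_miles / cruise_mph * 60)
    return total_min // 60, total_min % 60

hours, minutes = flight_time(1614.735)
print(f"{hours} h {minutes} min")  # a ballpark estimate near the 3 h 33 min shown above
```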

Flight carbon footprint between Oxford House Airport (YOH) and St. Lewis (Fox Harbour) Airport (YFX)

On average, flying from Oxford House to St. Lewis generates about 187 kg of CO2 per passenger; 187 kilograms is equal to 412 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
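The kilogram-to-pound conversion above can be checked with the exact definition of the avoirdupois pound:

```python
KG_PER_LB = 0.45359237  # exact definition of the international avoirdupois pound

def kg_to_lbs(kg):
    """Convert kilograms to pounds."""
    return kg / KG_PER_LB

print(round(kg_to_lbs(187)))  # 412, matching the figure above
```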

Map of flight path from Oxford House to St. Lewis

See the map of the shortest flight path between Oxford House Airport (YOH) and St. Lewis (Fox Harbour) Airport (YFX).

Airport information

Origin Oxford House Airport
City: Oxford House
Country: Canada
IATA Code: YOH
ICAO Code: CYOH
Coordinates: 54°55′59″N, 95°16′44″W
Destination St. Lewis (Fox Harbour) Airport
City: St. Lewis
Country: Canada
IATA Code: YFX
ICAO Code: CCK4
Coordinates: 52°22′22″N, 55°40′26″W