How far is Saint John from Hamilton?

The distance between Hamilton (L.F. Wade International Airport) and Saint John (Saint John Airport) is 896 miles / 1442 kilometers / 778 nautical miles.

L.F. Wade International Airport – Saint John Airport

Distance from Hamilton to Saint John

There are several ways to calculate the distance from Hamilton to Saint John. Here are two standard methods:

Vincenty's formula (applied above)
  • 895.801 miles
  • 1441.652 kilometers
  • 778.430 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
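For readers who want to reproduce the figure above, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The function name, convergence tolerance, and iteration cap are our choices, not the calculator's published code, and the decimal coordinates are converted from the DMS values listed under Airport information below.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Distance in meters between two points on the WGS-84 ellipsoid
    (Vincenty's inverse formula). Returns None if the iteration fails
    to converge, which can happen for nearly antipodal points."""
    a = 6378137.0               # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a             # semi-minor axis

    L = math.radians(lon2 - lon1)
    # Reduced latitudes on the auxiliary sphere
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0          # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # cos2_alpha == 0 only for points on the equator
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break
    else:
        return None             # did not converge

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
                                   * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# BDA (32°21′50″N, 64°40′43″W) to YSJ (45°18′57″N, 65°53′25″W)
meters = vincenty_distance(32.36389, -64.67861, 45.31583, -65.89028)
print(f"{meters / 1609.344:.3f} miles")   # ≈ 895.8 miles
```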

Haversine formula
  • 897.248 miles
  • 1443.980 kilometers
  • 779.687 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
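A haversine implementation is much shorter. This sketch assumes a mean Earth radius of 6,371 km; picking a different radius shifts the result slightly.

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in kilometers on a sphere with the given
    mean Earth radius (haversine formula)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h))

km = haversine_distance(32.36389, -64.67861, 45.31583, -65.89028)
print(f"{km:.3f} km / {km / 1.609344:.3f} miles")  # ≈ 1443.98 km / 897.25 miles
```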

How long does it take to fly from Hamilton to Saint John?

The estimated flight time from L.F. Wade International Airport to Saint John Airport is 2 hours and 11 minutes.
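The calculator does not publish its timing formula. A common rule of thumb estimates block time as the cruise segment plus a fixed allowance for taxi, climb, and descent; the sketch below assumes a 500 mph cruise speed and a 30-minute allowance, which lands close to the figure above but is not the site's exact method.

```python
def estimated_flight_time_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    """Rule-of-thumb block time: cruise segment plus a fixed allowance
    for taxi, climb, and descent. Both parameters are assumptions."""
    return overhead_min + distance_miles / cruise_mph * 60

minutes = estimated_flight_time_minutes(895.801)
print(f"{int(minutes // 60)} h {round(minutes % 60)} min")  # ≈ 2 h 17 min
```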

What is the time difference between Hamilton and Saint John?

There is no time difference between Hamilton and Saint John; both observe Atlantic Time.

Flight carbon footprint between L.F. Wade International Airport (BDA) and Saint John Airport (YSJ)

On average, flying from Hamilton to Saint John generates about 143 kg (315 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
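The page does not state its emissions methodology. The sketch below simply backs an emission factor out of the quoted numbers (143 kg over about 1,442 km is roughly 0.099 kg of CO2 per passenger-kilometer); real factors vary with aircraft type, load factor, and accounting method.

```python
KG_PER_LB = 0.45359237

def flight_co2_kg(distance_km, kg_co2_per_pax_km=0.099):
    """Per-passenger CO2 from jet-fuel burn. The emission factor is an
    assumption inferred from the figures quoted above, not a published
    value from the calculator."""
    return distance_km * kg_co2_per_pax_km

co2_kg = flight_co2_kg(1441.652)
print(f"{co2_kg:.0f} kg ≈ {co2_kg / KG_PER_LB:.0f} lb")  # ≈ 143 kg ≈ 315 lb
```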

Map of flight path from Hamilton to Saint John

[Map: shortest flight path between L.F. Wade International Airport (BDA) and Saint John Airport (YSJ)]

Airport information

Origin: L.F. Wade International Airport
City: Hamilton
Country: Bermuda
IATA Code: BDA
ICAO Code: TXKF
Coordinates: 32°21′50″N, 64°40′43″W
Destination: Saint John Airport
City: Saint John
Country: Canada
IATA Code: YSJ
ICAO Code: CYSJ
Coordinates: 45°18′57″N, 65°53′25″W
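To use these coordinates with either distance formula above, convert them from degrees/minutes/seconds to decimal degrees. A small helper (the function name is ours):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds to signed decimal degrees;
    south and west hemispheres are negative."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

bda = (dms_to_decimal(32, 21, 50, "N"), dms_to_decimal(64, 40, 43, "W"))
ysj = (dms_to_decimal(45, 18, 57, "N"), dms_to_decimal(65, 53, 25, "W"))
print(f"BDA: {bda[0]:.5f}, {bda[1]:.5f}")  # 32.36389, -64.67861
print(f"YSJ: {ysj[0]:.5f}, {ysj[1]:.5f}")  # 45.31583, -65.89028
```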