
How far is Kitchener from Hamilton?

The distance between Hamilton (L.F. Wade International Airport) and Kitchener (Region of Waterloo International Airport) is 1146 miles / 1844 kilometers / 996 nautical miles.

Distance from Hamilton to Kitchener

There are several ways to calculate the distance from Hamilton to Kitchener. Here are two standard methods:

Vincenty's formula (applied above)
  • 1146.046 miles
  • 1844.383 kilometers
  • 995.887 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
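
As a rough illustration, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid, run on the BDA and YKF coordinates listed in the airport information below. The function name and decimal-degree values are assumptions introduced for this example; the printed figures should land close to the Vincenty numbers quoted above.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Inverse Vincenty: geodesic distance in metres on the WGS-84 ellipsoid."""
    a = 6378137.0               # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563       # WGS-84 flattening
    b = (1 - f) * a             # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):                      # iterate until lambda converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0                             # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)    # equatorial geodesic case
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
        - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
        * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)           # geodesic length in metres

# BDA (32°21′50″N, 64°40′43″W) and YKF (43°27′38″N, 80°22′42″W) in decimal degrees
metres = vincenty_distance(32.363889, -64.678611, 43.460556, -80.378333)
print(f"{metres / 1609.344:.3f} mi, {metres / 1000:.3f} km, {metres / 1852:.3f} NM")
```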

Haversine formula
  • 1145.459 miles
  • 1843.437 kilometers
  • 995.376 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
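
For comparison, a short Python sketch of the haversine formula on a spherical Earth. The mean radius of 6,371 km is the usual convention but an assumption here, so the printed figures may differ slightly from the haversine values quoted above.

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in kilometres, assuming a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    # haversine of the central angle between the two points
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h))

km = haversine_distance(32.363889, -64.678611, 43.460556, -80.378333)
print(f"{km / 1.609344:.3f} mi, {km:.3f} km, {km / 1.852:.3f} NM")
```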

How long does it take to fly from Hamilton to Kitchener?

The estimated flight time from L.F. Wade International Airport to Region of Waterloo International Airport is 2 hours and 40 minutes.
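
The page does not state how the flight time is derived. A common back-of-the-envelope approach (an assumption here, not the site's published model) divides the distance by an average block speed and adds a fixed allowance for taxi, climb and descent:

```python
# Rough flight-time estimate: distance / assumed average speed + fixed allowance.
# The ~500 mph speed and 30-minute allowance are illustrative assumptions, so the
# result is only in the same ballpark as the 2 h 40 min quoted above.
distance_miles = 1146.046
cruise_mph = 500.0
allowance_hours = 0.5

total_hours = distance_miles / cruise_mph + allowance_hours
hours = int(total_hours)
minutes = round((total_hours - hours) * 60)
print(f"Estimated flight time: {hours} h {minutes} min")
```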

Flight carbon footprint between L.F. Wade International Airport (BDA) and Region of Waterloo International Airport (YKF)

On average, flying from Hamilton to Kitchener generates about 159 kg of CO2 per passenger (159 kilograms is equal to about 351 pounds). These figures are estimates and include only the CO2 generated by burning jet fuel.
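
As a quick check of the unit conversion (the 159 kg figure comes from the estimate above; only the standard kilograms-to-pounds factor is added here):

```python
co2_kg = 159                      # per-passenger estimate quoted above
co2_lbs = co2_kg * 2.20462        # kilograms to pounds
print(f"{co2_kg} kg is about {co2_lbs:.0f} lbs")   # ~351 lbs
```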

Map of flight path from Hamilton to Kitchener

See the map of the shortest flight path between L.F. Wade International Airport (BDA) and Region of Waterloo International Airport (YKF).

Airport information

Origin: L.F. Wade International Airport
City: Hamilton
Country: Bermuda
IATA Code: BDA
ICAO Code: TXKF
Coordinates: 32°21′50″N, 64°40′43″W

Destination: Region of Waterloo International Airport
City: Kitchener
Country: Canada
IATA Code: YKF
ICAO Code: CYKF
Coordinates: 43°27′38″N, 80°22′42″W