
How far is Belfast from Toronto?

The distance between Toronto (Toronto Pearson International Airport) and Belfast (Belfast International Airport) is 3253 miles / 5235 kilometers / 2826 nautical miles.
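For reference, the three figures are the same distance expressed in different units; a quick sketch of the conversions (the factors are exact unit definitions, the rounding is mine):

```python
MILES_TO_KM = 1.609344            # a statute mile is defined as 1609.344 m
MILES_TO_NMI = 1609.344 / 1852    # a nautical mile is defined as 1852 m

distance_miles = 3252.644          # Vincenty result from the section below
print(f"{distance_miles * MILES_TO_KM:.3f} km")    # ~5234.623 km
print(f"{distance_miles * MILES_TO_NMI:.3f} nmi")  # ~2826.470 nmi
```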


Distance from Toronto to Belfast

There are several ways to calculate the distance from Toronto to Belfast. Here are two standard methods:

Vincenty's formula (applied above)
  • 3252.644 miles
  • 5234.623 kilometers
  • 2826.470 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
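As a concrete illustration, here is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid (the ellipsoid choice and convergence settings are my assumptions; the calculator's exact parameters are not published):

```python
import math

def vincenty_inverse(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Distance in metres between two points on the WGS-84 ellipsoid."""
    a = 6378137.0              # semi-major axis (m)
    f = 1 / 298.257223563      # flattening
    b = (1 - f) * a            # semi-minor axis (m)

    # Reduced latitudes and the difference in longitude
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):  # iterate λ on the auxiliary sphere until it converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (sigma + C * sin_sigma * (
            cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    # Evaluate the ellipsoidal correction and the final distance
    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - d_sigma)

# YYZ and BFS in decimal degrees (converted from the DMS coordinates listed below)
print(f"{vincenty_inverse(43.6769, -79.6306, 54.6575, -6.2156) / 1000:.3f} km")  # ~5234.6 km
```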

Haversine formula
  • 3243.354 miles
  • 5219.672 kilometers
  • 2818.397 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth, giving the great-circle distance – the shortest path between two points along the surface of the sphere.
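By comparison, the haversine computation is only a few lines; a minimal sketch, assuming the commonly used mean Earth radius of 6371 km (the radius the calculator uses is not stated):

```python
import math

def haversine(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in km between two points on a sphere."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

print(f"{haversine(43.6769, -79.6306, 54.6575, -6.2156):.3f} km")  # ~5219.7 km
```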

How long does it take to fly from Toronto to Belfast?

The estimated flight time from Toronto Pearson International Airport to Belfast International Airport is 6 hours and 39 minutes.
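The calculator does not say how it derives this estimate. A common back-of-the-envelope approach divides the great-circle distance by an assumed average block speed; a speed of roughly 489 mph (my assumption, back-solved from the figures above) reproduces the quoted time:

```python
def flight_time(distance_miles, avg_speed_mph=489):
    # avg_speed_mph is an assumed average block speed, not a published figure
    hours = distance_miles / avg_speed_mph
    h, m = int(hours), round((hours - int(hours)) * 60)
    if m == 60:
        h, m = h + 1, 0
    return f"{h} hours and {m} minutes"

print(flight_time(3253))  # -> "6 hours and 39 minutes"
```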

Flight carbon footprint between Toronto Pearson International Airport (YYZ) and Belfast International Airport (BFS)

On average, flying from Toronto to Belfast generates about 365 kg of CO2 per passenger; 365 kilograms is equal to 804 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
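The unit conversion can be checked directly. The 365 kg figure itself comes from the site's emissions model, which is not published, so only the kg-to-lb step is shown; note that the quoted 804 lbs corresponds to truncating, rather than rounding, the exact product:

```python
KG_TO_LB = 2.20462  # pounds per kilogram (standard conversion factor)
co2_kg = 365
print(int(co2_kg * KG_TO_LB))  # 804 (truncated; the exact product is ~804.7)
```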

Map of flight path from Toronto to Belfast

See the map of the shortest flight path between Toronto Pearson International Airport (YYZ) and Belfast International Airport (BFS).

Airport information

Origin: Toronto Pearson International Airport
City: Toronto
Country: Canada
IATA Code: YYZ
ICAO Code: CYYZ
Coordinates: 43°40′37″N, 79°37′50″W
Destination: Belfast International Airport
City: Belfast
Country: United Kingdom
IATA Code: BFS
ICAO Code: EGAA
Coordinates: 54°39′27″N, 6°12′56″W
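The distance formulas above take decimal degrees, while the airport coordinates are listed in degrees, minutes, and seconds. A small conversion helper (the function name and signature are my own):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# YYZ: 43°40′37″N, 79°37′50″W  ->  (43.6769, -79.6306)
print(dms_to_decimal(43, 40, 37, "N"), dms_to_decimal(79, 37, 50, "W"))
# BFS: 54°39′27″N, 6°12′56″W   ->  (54.6575, -6.2156)
print(dms_to_decimal(54, 39, 27, "N"), dms_to_decimal(6, 12, 56, "W"))
```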