Air Miles Calculator

How far is Basseterre from Toronto?

The distance between Toronto (Toronto Pearson International Airport) and Basseterre (Robert L. Bradshaw International Airport) is 2068 miles / 3329 kilometers / 1797 nautical miles.

Toronto Pearson International Airport – Robert L. Bradshaw International Airport

  • 2068 miles
  • 3329 kilometers
  • 1797 nautical miles


Distance from Toronto to Basseterre

There are several ways to calculate the distance from Toronto to Basseterre. Here are two standard methods:

Vincenty's formula (applied above)
  • 2068.358 miles
  • 3328.699 kilometers
  • 1797.354 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
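As a sketch, Vincenty's inverse formula can be implemented in plain Python. This assumes the WGS-84 ellipsoid (the page does not state which ellipsoid it uses) and takes the airport coordinates from the information section below, converted to decimal degrees.

```python
import math

def vincenty_inverse(lat1, lon1, lat2, lon2,
                     a=6378137.0, f=1 / 298.257223563,
                     tol=1e-12, max_iter=200):
    """Vincenty's inverse formula on the WGS-84 ellipsoid; returns metres."""
    b = a * (1 - f)
    # Reduced latitudes on the auxiliary sphere.
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)
    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m + C * cos_sigma *
                                     (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break
    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
        B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) *
        (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# YYZ and SKB in decimal degrees; result should be close to 3328.699 km.
d_m = vincenty_inverse(43.676944, -79.630556, 17.311111, -62.718611)
print(round(d_m / 1000, 3))
```

The iteration on λ usually converges in a handful of steps for points like these; it can fail to converge only for nearly antipodal pairs.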

Haversine formula
  • 2072.098 miles
  • 3334.719 kilometers
  • 1800.604 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
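The haversine formula is much shorter. This sketch assumes a mean Earth radius of 6371 km; the page does not state which radius it uses, so the result differs from the figure above by a few kilometres at most.

```python
import math

def haversine(lat1, lon1, lat2, lon2, r=6371.0):
    """Great-circle distance on a sphere of radius r (km by default)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# YYZ to SKB; close to the 3334.719 km figure above.
print(round(haversine(43.676944, -79.630556, 17.311111, -62.718611), 1))
```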

How long does it take to fly from Toronto to Basseterre?

The estimated flight time from Toronto Pearson International Airport to Robert L. Bradshaw International Airport is 4 hours and 24 minutes.
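The page does not publish its flight-time formula. A common rule of thumb, cruise at roughly 500 mph plus about 30 minutes of overhead for climb-out and approach, gives a figure in the same ballpark (not identical, since the assumed speed and overhead are guesses):

```python
def estimate_flight_time(distance_miles, cruise_mph=500.0, overhead_min=30.0):
    """Rough block-time estimate: cruise time plus a fixed overhead.
    The 500 mph and 30 min values are illustrative assumptions."""
    total_min = distance_miles / cruise_mph * 60 + overhead_min
    hours, minutes = divmod(round(total_min), 60)
    return hours, minutes

print(estimate_flight_time(2068))  # (4, 38) under these assumptions
```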

What is the time difference between Toronto and Basseterre?

Basseterre observes Atlantic Standard Time (UTC−4) year-round, while Toronto observes Eastern Time (UTC−5, or UTC−4 during daylight saving time). There is therefore no time difference while Toronto is on daylight saving time; during the rest of the year, Basseterre is one hour ahead of Toronto.

Flight carbon footprint between Toronto Pearson International Airport (YYZ) and Robert L. Bradshaw International Airport (SKB)

On average, flying from Toronto to Basseterre generates about 225 kg (496 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
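The kilogram-to-pound conversion above follows from the definition of the avoirdupois pound:

```python
KG_PER_LB = 0.45359237  # exact definition of the avoirdupois pound

co2_kg = 225
co2_lb = co2_kg / KG_PER_LB
print(round(co2_lb))  # 496
```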

Map of flight path from Toronto to Basseterre

See the map of the shortest flight path between Toronto Pearson International Airport (YYZ) and Robert L. Bradshaw International Airport (SKB).

Airport information

Origin Toronto Pearson International Airport
City: Toronto
Country: Canada
IATA Code: YYZ
ICAO Code: CYYZ
Coordinates: 43°40′37″N, 79°37′50″W
Destination Robert L. Bradshaw International Airport
City: Basseterre
Country: Saint Kitts and Nevis
IATA Code: SKB
ICAO Code: TKPK
Coordinates: 17°18′40″N, 62°43′7″W
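The coordinates above are given in degrees, minutes, and seconds; the distance formulas need decimal degrees. The conversion is degrees plus minutes/60 plus seconds/3600, negated for south and west:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter (N/S/E/W)
    to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

print(round(dms_to_decimal(43, 40, 37, "N"), 6))  # 43.676944 (YYZ latitude)
print(round(dms_to_decimal(79, 37, 50, "W"), 6))  # -79.630556 (YYZ longitude)
```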