
How far is Montego Bay from Port of Spain?

The distance between Port of Spain (Piarco International Airport) and Montego Bay (Sangster International Airport) is 1235 miles / 1987 kilometers / 1073 nautical miles.

Piarco International Airport – Sangster International Airport

  • 1235 miles
  • 1987 kilometers
  • 1073 nautical miles

Distance from Port of Spain to Montego Bay

There are several ways to calculate the distance from Port of Spain to Montego Bay. Here are two standard methods:

Vincenty's formula (applied above)
  • 1234.887 miles
  • 1987.358 kilometers
  • 1073.087 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
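As a sketch of how such a distance can be computed, here is a standard implementation of Vincenty's iterative inverse solution on the WGS-84 ellipsoid (this is not the calculator's own code, and the coordinates are the decimal form of the airport coordinates listed below):

```python
import math

# Airport coordinates from the airport information section, in decimal degrees.
POS = (10 + 35/60 + 43/3600, -(61 + 20/60 + 13/3600))   # Piarco
MBJ = (18 + 30/60 + 13/3600, -(77 + 54/60 + 48/3600))   # Sangster

def vincenty_miles(lat1, lon1, lat2, lon2):
    """Vincenty inverse solution on the WGS-84 ellipsoid, in statute miles."""
    a = 6378137.0               # WGS-84 semi-major axis (metres)
    f = 1 / 298.257223563       # WGS-84 flattening
    b = (1 - f) * a             # semi-minor axis (metres)
    big_l = math.radians(lon2 - lon1)
    u1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    u2r = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sin_u1, cos_u1 = math.sin(u1), math.cos(u1)
    sin_u2, cos_u2 = math.sin(u2r), math.cos(u2r)
    lam = big_l
    for _ in range(200):        # iterate until the auxiliary longitude converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cos_u2 * sin_lam,
                               cos_u1 * sin_u2 - sin_u1 * cos_u2 * cos_lam)
        cos_sigma = sin_u1 * sin_u2 + cos_u1 * cos_u2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cos_u1 * cos_u2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sin_u1 * sin_u2 / cos2_alpha
        c = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = big_l + (1 - c) * f * sin_alpha * (sigma + c * sin_sigma *
              (cos_2sm + c * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break
    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    big_a = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    big_b = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = big_b * sin_sigma * (cos_2sm + big_b / 4 *
              (cos_sigma * (-1 + 2 * cos_2sm ** 2) - big_b / 6 * cos_2sm *
               (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * big_a * (sigma - d_sigma) / 1609.344  # metres -> statute miles

print(round(vincenty_miles(*POS, *MBJ), 1))  # ≈ 1234.9 miles
```

The iteration refines the azimuth between the two points; for nearly antipodal pairs it can converge slowly, which is why the loop is capped.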

Haversine formula
  • 1234.752 miles
  • 1987.141 kilometers
  • 1072.970 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
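A minimal sketch of the haversine computation, assuming a mean Earth radius of 3,958.8 statute miles (the exact radius the calculator uses is not stated):

```python
import math

# Airport coordinates from the airport information section, in decimal degrees.
POS = (10 + 35/60 + 43/3600, -(61 + 20/60 + 13/3600))   # Piarco
MBJ = (18 + 30/60 + 13/3600, -(77 + 54/60 + 48/3600))   # Sangster

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance on a sphere, in statute miles."""
    r = 3958.8  # assumed mean Earth radius in statute miles
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

print(round(haversine_miles(*POS, *MBJ)))  # ≈ 1235
```

The small gap between this result and the Vincenty figure (about 0.1 miles here) comes entirely from treating the Earth as a sphere rather than an ellipsoid.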

How long does it take to fly from Port of Spain to Montego Bay?

The estimated flight time from Piarco International Airport to Sangster International Airport is 2 hours and 50 minutes.
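The calculator's exact assumptions are not published; a common rule of thumb is an assumed average cruise speed of about 500 mph plus roughly 30 minutes of taxi, climb, and descent overhead, which lands in the same ballpark as the quoted figure:

```python
def estimate_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough flight-time estimate: cruise time plus fixed overhead (assumed values)."""
    return round(overhead_min + distance_miles / cruise_mph * 60)

total = estimate_minutes(1235)
print(f"{total // 60} h {total % 60} min")  # 2 h 58 min with these assumptions
```

Tuning the assumed cruise speed slightly upward reproduces the quoted 2 hours and 50 minutes.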

Flight carbon footprint between Piarco International Airport (POS) and Sangster International Airport (MBJ)

On average, flying from Port of Spain to Montego Bay generates about 163 kg (359 lb) of CO2 per passenger. These figures are estimates and account only for the CO2 produced by burning jet fuel.
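The unit conversion and the implied per-mile intensity behind these figures are simple arithmetic:

```python
co2_kg = 163.0                   # per-passenger estimate from above
co2_lb = co2_kg * 2.20462        # kilograms -> pounds
per_mile = co2_kg / 1235         # kg of CO2 per passenger-mile on this route

print(round(co2_lb))             # 359
print(round(per_mile, 2))        # 0.13
```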

Map of flight path from Port of Spain to Montego Bay

See the map of the shortest flight path between Piarco International Airport (POS) and Sangster International Airport (MBJ).

Airport information

Origin Piarco International Airport
City: Port of Spain
Country: Trinidad and Tobago
IATA Code: POS
ICAO Code: TTPP
Coordinates: 10°35′43″N, 61°20′13″W
Destination Sangster International Airport
City: Montego Bay
Country: Jamaica
IATA Code: MBJ
ICAO Code: MKJS
Coordinates: 18°30′13″N, 77°54′48″W
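The degrees-minutes-seconds coordinates above convert to the decimal degrees used by the distance formulas like so (`dms_to_decimal` is a hypothetical helper, not part of the site):

```python
import re

def dms_to_decimal(dms):
    """Parse a coordinate like '10°35′43″N' into signed decimal degrees."""
    deg, minutes, sec, hemi = re.fullmatch(
        r"(\d+)°(\d+)′(\d+)″([NSEW])", dms).groups()
    value = int(deg) + int(minutes) / 60 + int(sec) / 3600
    # South and West hemispheres are negative by convention.
    return -value if hemi in "SW" else value

print(round(dms_to_decimal("10°35′43″N"), 4))   # 10.5953
print(round(dms_to_decimal("77°54′48″W"), 4))   # -77.9133
```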