
Distance between Horta (HOR) and St. John's (YYT)

Flight distance from Horta to St. John's (Horta Airport – St. John's International Airport) is 1361 miles / 2191 kilometers / 1183 nautical miles. Estimated flight time is 3 hours 4 minutes.

Horta – St. John's

Distance: 1361 miles / 2191 kilometers / 1183 nautical miles
Flight time: 3 h 4 min
Time difference: 2 h 30 min
CO2 emission: 171 kg

How far is St. John's from Horta?

There are several ways to calculate the distance between Horta and St. John's. Here are two common methods:

Vincenty's formula (applied above)
  • 1361.182 miles
  • 2190.610 kilometers
  • 1182.835 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth’s surface, using an ellipsoidal model of the earth.
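For reference, below is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid; the function name and iteration tolerance are illustrative choices, not this calculator's actual code. Fed the airport coordinates listed at the bottom of this page, it reproduces the ~2190.6 km figure above to within rounding.

    import math

    def vincenty_distance(lat1, lon1, lat2, lon2):
        """Geodesic distance in metres on the WGS-84 ellipsoid (Vincenty inverse)."""
        a = 6378137.0              # WGS-84 semi-major axis (m)
        f = 1 / 298.257223563      # WGS-84 flattening
        b = (1 - f) * a            # semi-minor axis

        U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
        U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
        L = math.radians(lon2 - lon1)
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)

        lam = L
        for _ in range(200):                     # iterate until the longitude converges
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.hypot(cosU2 * sin_lam,
                                   cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
            if sin_sigma == 0:
                return 0.0                       # coincident points
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos2_alpha = 1 - sin_alpha ** 2
            cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
            C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
            lam_prev = lam
            lam = L + (1 - C) * f * sin_alpha * (
                sigma + C * sin_sigma * (
                    cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
            if abs(lam - lam_prev) < 1e-12:
                break

        u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sm ** 2) -
            B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
        return b * A * (sigma - d_sigma)         # metres

    # HOR and YYT coordinates, converted from the airport information below
    m = vincenty_distance(38.519722, -28.715833, 47.618333, -52.751667)
    print(m / 1000, m / 1609.344, m / 1852)      # km, statute miles, nautical miles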

Haversine formula
  • 1358.584 miles
  • 2186.429 kilometers
  • 1180.577 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
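Running the same coordinates through a haversine sketch, assuming a mean Earth radius of 6371 km (the exact radius this site uses is not stated), gives the slightly shorter spherical figure quoted above:

    import math

    def haversine_distance(lat1, lon1, lat2, lon2, r_km=6371.0):
        """Great-circle distance in km on a sphere of radius r_km (assumed mean radius)."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        h = (math.sin(dphi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
        return 2 * r_km * math.asin(math.sqrt(h))

    print(haversine_distance(38.519722, -28.715833, 47.618333, -52.751667))
    # ≈ 2186.4 km, matching the haversine value above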

How long does it take to fly from Horta to St. John's?

The estimated flight time from Horta Airport to St. John's International Airport is 3 hours 4 minutes.
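The calculator does not publish its flight-time formula. A common rule of thumb, sketched below purely as an assumption, is cruise speed plus a fixed allowance for taxi, climb, and descent; with 500 mph and 30 minutes it yields about 3 h 13 min, so this site evidently uses slightly different constants.

    def estimate_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
        # cruise_mph and overhead_min are hypothetical parameters,
        # not the calculator's published values.
        total_min = round(distance_miles / cruise_mph * 60 + overhead_min)
        return divmod(total_min, 60)             # (hours, minutes)

    print(estimate_flight_time(1361))            # (3, 13) under these assumptions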

What is the time difference between Horta and St. John's?

The time difference between Horta and St. John's is 2 hours 30 minutes, with St. John's behind Horta.
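A time converter along those lines is easy to sketch with Python's standard zoneinfo module and the IANA zones Atlantic/Azores and America/St_Johns; the zone database handles the two regions' daylight-saving rules automatically.

    from datetime import datetime
    from zoneinfo import ZoneInfo

    def horta_to_st_johns(naive_local):
        """Interpret a naive datetime as Horta local time; return St. John's local time."""
        horta = naive_local.replace(tzinfo=ZoneInfo("Atlantic/Azores"))
        return horta.astimezone(ZoneInfo("America/St_Johns"))

    print(horta_to_st_johns(datetime(2024, 1, 15, 12, 0)))
    # 2024-01-15 09:30 in St. John's, i.e. 2 h 30 min behind Horta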

Flight carbon footprint between Horta Airport (HOR) and St. John's International Airport (YYT)

On average, flying from Horta to St. John's generates about 171 kg of CO2 per passenger; 171 kilograms is equal to 376 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
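As a back-of-the-envelope check, those numbers imply an emission factor of roughly 0.126 kg of CO2 per passenger-mile. The factor in the sketch below is back-solved from the figures above, not the calculator's published model.

    KG_PER_LB = 0.45359237

    def co2_per_passenger_kg(distance_miles, kg_per_pax_mile=0.1256):
        # kg_per_pax_mile is back-solved from 171 kg over 1361 miles;
        # the site's actual emission model is not published.
        return distance_miles * kg_per_pax_mile

    kg = co2_per_passenger_kg(1361)
    print(f"{kg:.0f} kg = {kg / KG_PER_LB:.0f} lbs")
    # ≈ 171 kg ≈ 377 lbs (the page shows 376, a rounding difference)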

Map of flight path from Horta to St. John's

Shortest flight path between Horta Airport (HOR) and St. John's International Airport (YYT).

Airport information

Origin Horta Airport
City: Horta
Country: Portugal
IATA Code: HOR
ICAO Code: LPHR
Coordinates: 38°31′11″N, 28°42′57″W
Destination St. John's International Airport
City: St. John's
Country: Canada
IATA Code: YYT
ICAO Code: CYYT
Coordinates: 47°37′6″N, 52°45′6″W
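
The coordinates above are given in degrees, minutes, and seconds. To feed them into the distance functions sketched earlier, a small helper (the function name is illustrative) converts them to signed decimal degrees:

    def dms_to_decimal(deg, minutes, seconds, hemisphere):
        """Convert degrees/minutes/seconds plus N/S/E/W into signed decimal degrees."""
        sign = -1 if hemisphere in ("S", "W") else 1
        return sign * (deg + minutes / 60 + seconds / 3600)

    HOR = (dms_to_decimal(38, 31, 11, "N"), dms_to_decimal(28, 42, 57, "W"))
    YYT = (dms_to_decimal(47, 37, 6, "N"), dms_to_decimal(52, 45, 6, "W"))
    print(HOR, YYT)   # ≈ (38.5197, -28.7158) and (47.6183, -52.7517)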