
How far is Graciosa Island from St. John's?

The distance between St. John's (St. John's International Airport) and Graciosa Island (Graciosa Airport) is 1369 miles / 2204 kilometers / 1190 nautical miles.

St. John's International Airport – Graciosa Airport

Distance: 1369 miles / 2204 kilometers / 1190 nautical miles
Flight time: 3 h 5 min
Time difference: 2 h 30 min
CO2 emission: 171 kg

Distance from St. John's to Graciosa Island

There are several ways to calculate the distance from St. John's to Graciosa Island. Here are two standard methods:

Vincenty's formula (applied above)
  • 1369.460 miles
  • 2203.932 kilometers
  • 1190.028 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
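As a rough cross-check (a sketch, not necessarily the exact method behind the figures above), the ellipsoidal distance can be reproduced with the geopy package, whose geodesic() routine solves the same inverse problem on the WGS-84 ellipsoid; it uses Karney's algorithm rather than classic Vincenty, but the two agree to well under a metre at this range:

```python
# A minimal sketch, assuming the geopy package is installed (pip install geopy).
from geopy.distance import geodesic

# Airport coordinates from the airport information section, converted to decimal degrees.
YYT = (47.618333, -52.751667)   # St. John's International Airport
GRW = (39.091944, -28.029722)   # Graciosa Airport

d = geodesic(YYT, GRW)          # WGS-84 ellipsoidal (geodesic) distance
print(f"{d.miles:.3f} miles")            # ~1369.5 miles
print(f"{d.km:.3f} kilometers")          # ~2203.9 kilometers
print(f"{d.km / 1.852:.3f} nautical miles")
```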

Haversine formula
  • 1366.676 miles
  • 2199.451 kilometers
  • 1187.609 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
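The spherical figure can be reproduced in a few lines of Python; this sketch assumes a mean Earth radius of 6371 km and uses the airport coordinates (listed below) converted to decimal degrees:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points on a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

# YYT and GRW in decimal degrees
km = haversine_km(47.618333, -52.751667, 39.091944, -28.029722)
print(f"{km:.1f} km, {km / 1.609344:.1f} miles, {km / 1.852:.1f} nautical miles")  # ~2199 km
```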

How long does it take to fly from St. John's to Graciosa Island?

The estimated flight time from St. John's International Airport to Graciosa Airport is 3 hours and 5 minutes.
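Calculators like this one typically derive flight time from the great-circle distance, an assumed cruise speed, and a fixed allowance for taxi, take-off, climb and descent. The parameters in the sketch below are assumptions back-solved to match the 3 h 5 min figure (roughly 530 mph cruise plus a 30-minute allowance), not the site's published formula:

```python
# A rough sketch, not the calculator's actual formula: flight time approximated as a
# fixed taxi/climb/descent allowance plus time spent at an assumed cruise speed.
def flight_time_minutes(distance_miles, cruise_mph=530, allowance_min=30):
    return allowance_min + distance_miles / cruise_mph * 60

total = flight_time_minutes(1369.46)                      # ~185 minutes
print(f"{int(total // 60)} h {round(total % 60)} min")    # -> 3 h 5 min
```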

Flight carbon footprint between St. John's International Airport (YYT) and Graciosa Airport (GRW)

On average, flying from St. John's to Graciosa Island generates about 171 kg of CO2 per passenger; 171 kilograms equals 378 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
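The arithmetic can be checked directly. The per-kilometre intensity below is simply back-calculated from the 171 kg and 2204 km figures, not an official emission factor, and the 378 lbs quoted above presumably comes from converting the unrounded kilogram value:

```python
co2_kg = 171
distance_km = 2204

print(f"{co2_kg * 2.20462:.0f} lbs")                            # -> 377 lbs (378 in the text)
print(f"{co2_kg / distance_km:.3f} kg CO2 per passenger-km")    # -> 0.078
```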

Map of flight path from St. John's to Graciosa Island

See the map of the shortest flight path between St. John's International Airport (YYT) and Graciosa Airport (GRW).

Airport information

Origin: St. John's International Airport
City: St. John's
Country: Canada
IATA Code: YYT
ICAO Code: CYYT
Coordinates: 47°37′6″N, 52°45′6″W
Destination: Graciosa Airport
City: Graciosa Island
Country: Portugal
IATA Code: GRW
ICAO Code: LPGR
Coordinates: 39°5′31″N, 28°1′47″W
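For reference, a minimal helper (an illustration, not part of the original page) that converts the DMS coordinates above into the signed decimal degrees used in the distance sketches earlier, with south and west taken as negative:

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to signed decimal degrees."""
    value = deg + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

print(dms_to_decimal(47, 37, 6, "N"), dms_to_decimal(52, 45, 6, "W"))   # YYT ~ 47.6183, -52.7517
print(dms_to_decimal(39, 5, 31, "N"), dms_to_decimal(28, 1, 47, "W"))   # GRW ~ 39.0919, -28.0297
```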