
How far is London from St. George's?

The distance between St. George's (Maurice Bishop International Airport) and London (London International Airport) is 2432 miles / 3914 kilometers / 2113 nautical miles.

Maurice Bishop International Airport – London International Airport

  • 2432 miles
  • 3914 kilometers
  • 2113 nautical miles


Distance from St. George's to London

There are several ways to calculate the distance from St. George's to London. Here are two standard methods:

Vincenty's formula (applied above)
  • 2431.804 miles
  • 3913.609 kilometers
  • 2113.180 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
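As a rough illustration, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid, applied to the two airports' coordinates from the table below (west longitudes negative). The iteration and series coefficients follow the standard published formulation; the calculator's exact implementation is not shown on this page, so treat this as an independent reconstruction.

```python
import math

def vincenty_distance_m(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns distance in meters."""
    a = 6378137.0                  # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563          # WGS-84 flattening
    b = (1 - f) * a                # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):           # iterate the auxiliary longitude until convergence
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0             # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# GND (12°0′15″N, 61°47′10″W) to YXU (43°2′8″N, 81°9′14″W)
meters = vincenty_distance_m(12.004167, -61.786111, 43.035556, -81.153889)
print(round(meters / 1000, 3))     # ≈ 3913.6 km
```

Note that the simple iteration above can fail to converge for nearly antipodal points; production geodesic libraries handle that case separately.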

Haversine formula
  • 2437.021 miles
  • 3922.006 kilometers
  • 2117.714 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
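The haversine calculation is compact enough to show in full. A minimal Python sketch, assuming the conventional mean Earth radius of 6371 km and the airport coordinates from the table below:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given radius, in kilometers."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)           # difference in latitude
    dlam = math.radians(lon2 - lon1)           # difference in longitude
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# GND (12°0′15″N, 61°47′10″W) to YXU (43°2′8″N, 81°9′14″W), west longitudes negative
km = haversine_km(12.004167, -61.786111, 43.035556, -81.153889)
print(round(km, 3))                            # ≈ 3922 km
```

The ~8 km gap between this result and the Vincenty figure reflects the spherical approximation: the real Earth is slightly flattened at the poles.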

How long does it take to fly from St. George's to London?

The estimated flight time from Maurice Bishop International Airport to London International Airport is 5 hours and 6 minutes.
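The page does not publish the formula behind this estimate, but the quoted figures imply an average block speed, which is easy to back out:

```python
distance_miles = 2432                     # great-circle distance from above
flight_minutes = 5 * 60 + 6               # 5 hours 6 minutes
avg_speed_mph = distance_miles / (flight_minutes / 60)
print(round(avg_speed_mph))               # ≈ 477 mph
```

That is consistent with typical jet cruise speeds once taxi, climb, and descent are averaged in.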

What is the time difference between St. George's and London?

There is no time difference between St. George's and London while London observes Eastern Daylight Time (both are then at UTC−4). St. George's stays on Atlantic Standard Time (UTC−4) year-round, so during Eastern Standard Time (UTC−5) London is one hour behind St. George's.

Flight carbon footprint between Maurice Bishop International Airport (GND) and London International Airport (YXU)

On average, flying from St. George's to London generates about 267 kg (589 lbs) of CO2 per passenger. This figure is an estimate and includes only the CO2 produced by burning jet fuel.
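The kilogram and pound figures are just the same quantity in two units; a quick check of the conversion:

```python
co2_kg = 267
co2_lbs = co2_kg * 2.20462        # 1 kg = 2.20462 lb
print(round(co2_lbs))             # 589 lbs
```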

Map of flight path from St. George's to London

See the map of the shortest flight path between Maurice Bishop International Airport (GND) and London International Airport (YXU).

Airport information

Origin: Maurice Bishop International Airport
City: St. George's
Country: Grenada
IATA Code: GND
ICAO Code: TGPY
Coordinates: 12°0′15″N, 61°47′10″W
Destination: London International Airport
City: London
Country: Canada
IATA Code: YXU
ICAO Code: CYXU
Coordinates: 43°2′8″N, 81°9′14″W
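The coordinates above are given in degrees/minutes/seconds, while distance formulas need signed decimal degrees. A small helper for the conversion (the hemisphere letter sets the sign: south and west are negative):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere letter to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# Coordinates from the airport table above
gnd = (dms_to_decimal(12, 0, 15, "N"), dms_to_decimal(61, 47, 10, "W"))
yxu = (dms_to_decimal(43, 2, 8, "N"), dms_to_decimal(81, 9, 14, "W"))
print(gnd)   # ≈ (12.0042, -61.7861)
print(yxu)   # ≈ (43.0356, -81.1539)
```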