
How far is London from Ahmedabad?

The distance between Ahmedabad (Sardar Vallabhbhai Patel International Airport) and London, Canada (London International Airport) is 7587 miles / 12210 kilometers / 6593 nautical miles.

Sardar Vallabhbhai Patel International Airport – London International Airport

Distance: 7587 miles / 12210 kilometers / 6593 nautical miles
Flight time: 14 h 51 min
Time difference: 9 h 30 min
CO2 emission: 938 kg


Distance from Ahmedabad to London

There are several ways to calculate the distance from Ahmedabad to London. Here are two standard methods:

Vincenty's formula (applied above)
  • 7586.807 miles
  • 12209.782 kilometers
  • 6592.755 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
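
For reference, here is a minimal Python sketch of Vincenty's inverse method. The WGS-84 ellipsoid constants below are an assumption; this site does not state which ellipsoid it uses. Note that the iteration can fail to converge for nearly antipodal points.

import math

def vincenty_distance_km(lat1, lon1, lat2, lon2):
    a = 6378137.0          # WGS-84 semi-major axis (metres) - assumed
    f = 1 / 298.257223563  # WGS-84 flattening - assumed
    b = (1 - f) * a        # semi-minor axis

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate lambda until convergence
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)  # equatorial line
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0  # metres -> kilometres

With the airport coordinates listed under Airport information below, this returns roughly 12210 km, matching the figure above.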

Haversine formula
  • 7574.064 miles
  • 12189.275 kilometers
  • 6581.682 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
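
The haversine formula, by contrast, fits in a few lines of Python. The mean Earth radius of 6371 km used here is a common convention, not necessarily the value this site uses.

import math

def haversine_distance_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    # Great-circle distance on a sphere of the given radius.
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    d_phi = math.radians(lat2 - lat1)
    d_lam = math.radians(lon2 - lon1)
    a = (math.sin(d_phi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(d_lam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

print(haversine_distance_km(23.0769, 72.6344, 43.0356, -81.1539))  # ~12189 km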

How long does it take to fly from Ahmedabad to London?

The estimated flight time from Sardar Vallabhbhai Patel International Airport to London International Airport is 14 hours and 51 minutes.
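
The estimate above is consistent with a simple distance-over-speed model. Back-calculating from this page's own numbers gives an implied average block speed of about 511 mph; that constant is an illustration derived here, not a formula published by the site.

AVG_SPEED_MPH = 511  # implied by 7587 mi in 14 h 51 min; illustrative assumption

def estimated_flight_time(distance_miles, avg_speed_mph=AVG_SPEED_MPH):
    hours = distance_miles / avg_speed_mph
    return int(hours), round((hours - int(hours)) * 60)

h, m = estimated_flight_time(7587)
print(f"{h} h {m} min")  # 14 h 51 min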

Flight carbon footprint between Sardar Vallabhbhai Patel International Airport (AMD) and London International Airport (YXU)

On average, flying from Ahmedabad to London generates about 938 kg of CO2 per passenger; 938 kilograms is equal to 2,068 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
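
The per-passenger figure implies an emission factor of roughly 0.077 kg of CO2 per kilometre flown. The sketch below back-calculates that factor from this page's own numbers; it is not the site's published methodology.

LBS_PER_KG = 2.20462
CO2_KG_PER_PAX_KM = 938 / 12210  # ~0.077, implied by this page; assumption

def co2_per_passenger_kg(distance_km, factor=CO2_KG_PER_PAX_KM):
    return distance_km * factor

co2_kg = co2_per_passenger_kg(12210)
print(f"{co2_kg:.0f} kg = {co2_kg * LBS_PER_KG:.0f} lbs")  # 938 kg = 2068 lbs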

Map of flight path from Ahmedabad to London

See the map of the shortest flight path between Sardar Vallabhbhai Patel International Airport (AMD) and London International Airport (YXU).

Airport information

Origin: Sardar Vallabhbhai Patel International Airport
City: Ahmedabad
Country: India
IATA Code: AMD
ICAO Code: VAAH
Coordinates: 23°4′37″N, 72°38′4″E

Destination: London International Airport
City: London
Country: Canada
IATA Code: YXU
ICAO Code: CYXU
Coordinates: 43°2′8″N, 81°9′14″W
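
The coordinates above are given in degrees, minutes and seconds. A small helper (a hypothetical utility, not part of this site) converts them to the decimal degrees that the distance functions earlier on this page expect:

import re

def dms_to_decimal(dms):
    # Parse a coordinate like "23°4′37″N" into signed decimal degrees.
    deg, minutes, seconds, hemi = re.fullmatch(
        r"(\d+)°(\d+)′(\d+)″([NSEW])", dms).groups()
    value = int(deg) + int(minutes) / 60 + int(seconds) / 3600
    return -value if hemi in "SW" else value

amd = (dms_to_decimal("23°4′37″N"), dms_to_decimal("72°38′4″E"))
yxu = (dms_to_decimal("43°2′8″N"), dms_to_decimal("81°9′14″W"))
print(amd)  # ~(23.0769, 72.6344)
print(yxu)  # ~(43.0356, -81.1539)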