How far is Westray from Luqa?

The distance between Luqa (Malta International Airport) and Westray (Westray Airport) is 1803 miles / 2902 kilometers / 1567 nautical miles.

The driving distance from Luqa (MLA) to Westray (WRY) is 2492 miles / 4010 kilometers, and travel time by car is about 53 hours 2 minutes.

Malta International Airport – Westray Airport

1803 miles / 2902 kilometers / 1567 nautical miles

Distance from Luqa to Westray

There are several ways to calculate the distance from Luqa to Westray. Here are two standard methods:

Vincenty's formula (applied above)
  • 1803.094 miles
  • 2901.799 kilometers
  • 1566.846 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
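
For readers who want to reproduce the figure, here is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid. The function name, convergence tolerance, and iteration cap are our own choices, not taken from this page:

    import math

    def vincenty_inverse(lat1, lon1, lat2, lon2):
        """Ellipsoidal distance in metres between two points (WGS-84)."""
        a = 6378137.0            # semi-major axis (m)
        f = 1 / 298.257223563    # flattening
        b = (1 - f) * a          # semi-minor axis (m)

        U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
        U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
        L = math.radians(lon2 - lon1)
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)

        lam = L
        for _ in range(200):  # iterate until lambda converges
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.hypot(cosU2 * sin_lam,
                                   cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
            if sin_sigma == 0:
                return 0.0    # coincident points
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos2_alpha = 1 - sin_alpha ** 2
            cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
            C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
            lam_prev = lam
            lam = L + (1 - C) * f * sin_alpha * (
                sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
            if abs(lam - lam_prev) < 1e-12:
                break

        u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sm ** 2)
            - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
        return b * A * (sigma - d_sigma)

    # Airport coordinates from the table below, converted to decimal degrees
    metres = vincenty_inverse(35.8572, 14.4775, 59.3503, -2.9500)
    print(f"{metres / 1000:.1f} km")  # ≈ 2901.8 km, matching the figure above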

Haversine formula
  • 1802.288 miles
  • 2900.502 kilometers
  • 1566.146 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
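
A matching sketch of the haversine formula, for comparison. The mean Earth radius of 6371 km is the conventional value, not one stated on this page:

    import math

    def haversine(lat1, lon1, lat2, lon2, radius_km=6371.0):
        """Great-circle distance in kilometres on a spherical Earth."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        h = (math.sin(dphi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
        return 2 * radius_km * math.asin(math.sqrt(h))

    print(f"{haversine(35.8572, 14.4775, 59.3503, -2.9500):.1f} km")  # ≈ 2900.5 km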

How long does it take to fly from Luqa to Westray?

The estimated flight time from Malta International Airport to Westray Airport is 3 hours and 54 minutes.
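
The page does not publish its flight-time model. A common rule of thumb, sketched below with assumed parameters (an average cruise speed near 500 mph plus about half an hour for taxi, climb, and descent), lands in the same ballpark as the quoted 3 hours 54 minutes:

    def estimate_block_time(distance_miles, cruise_mph=500.0, overhead_hours=0.5):
        """Rough flight-time estimate: cruise time over the great-circle
        distance plus a fixed allowance for taxi, climb, and descent.
        Both parameters are assumptions, not this site's published model."""
        total_minutes = round((distance_miles / cruise_mph + overhead_hours) * 60)
        hours, minutes = divmod(total_minutes, 60)
        return f"{hours} h {minutes:02d} min"

    print(estimate_block_time(1803))  # ≈ 4 h 06 min with these assumed parameters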

Flight carbon footprint between Malta International Airport (MLA) and Westray Airport (WRY)

On average, flying from Luqa to Westray generates about 200 kg of CO2 per passenger, which is roughly 441 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
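
The site's emissions methodology isn't detailed here. As an illustration only, a flat per-passenger-kilometre emission factor reproduces the quoted numbers; the 0.069 kg/km value below is reverse-engineered to match the ~200 kg figure, not an official factor:

    def co2_per_passenger_kg(distance_km, kg_per_pax_km=0.069):
        """Illustrative fuel-burn-only CO2 estimate; the emission factor is
        an assumption chosen to match the quoted figure, not the site's model."""
        return distance_km * kg_per_pax_km

    kg = co2_per_passenger_kg(2902)
    print(f"{kg:.0f} kg CO2 ≈ {kg * 2.20462:.0f} lbs")  # ≈ 200 kg ≈ 441 lbs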

Map of flight path and driving directions from Luqa to Westray

See the map of the shortest flight path between Malta International Airport (MLA) and Westray Airport (WRY).

Airport information

Origin: Malta International Airport
City: Luqa
Country: Malta
IATA Code: MLA
ICAO Code: LMML
Coordinates: 35°51′26″N, 14°28′39″E

Destination: Westray Airport
City: Westray
Country: United Kingdom
IATA Code: WRY
ICAO Code: EGEW
Coordinates: 59°21′1″N, 2°57′0″W
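
The coordinates above are given in degrees, minutes, and seconds, while the distance formulas earlier expect decimal degrees. A small hypothetical helper for the conversion:

    def dms_to_decimal(degrees, minutes, seconds, hemisphere):
        """Convert degrees/minutes/seconds plus a hemisphere letter (N/S/E/W)
        to signed decimal degrees."""
        value = degrees + minutes / 60 + seconds / 3600
        return -value if hemisphere in ("S", "W") else value

    mla = (dms_to_decimal(35, 51, 26, "N"), dms_to_decimal(14, 28, 39, "E"))  # (35.8572, 14.4775)
    wry = (dms_to_decimal(59, 21, 1, "N"), dms_to_decimal(2, 57, 0, "W"))     # (59.3503, -2.9500)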