
How far is Humberside from Papa Westray?

The distance between Papa Westray (Papa Westray Airport) and Humberside (Humberside Airport) is 411 miles / 662 kilometers / 358 nautical miles.

The driving distance from Papa Westray (PPW) to Humberside (HUY) is 618 miles / 994 kilometers, and travel time by car is about 19 hours 39 minutes.

Papa Westray Airport – Humberside Airport

411 miles / 662 kilometers / 358 nautical miles

Distance from Papa Westray to Humberside

There are several ways to calculate the distance from Papa Westray to Humberside. Here are two standard methods:

Vincenty's formula (applied above)
  • 411.406 miles
  • 662.094 kilometers
  • 357.502 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
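As a sketch of how such a calculation works (not the calculator's own code), the inverse Vincenty iteration can be written in Python. The WGS-84 ellipsoid constants are standard; the coordinates are taken from the airport table at the end of this page.

```python
from math import radians, sin, cos, tan, atan, atan2, sqrt

# WGS-84 ellipsoid parameters.
A = 6378137.0                 # semi-major axis (m)
F = 1 / 298.257223563         # flattening
B = (1 - F) * A               # semi-minor axis (m)

def vincenty_km(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Inverse Vincenty: geodesic distance on the WGS-84 ellipsoid, in km."""
    L = radians(lon2 - lon1)
    U1 = atan((1 - F) * tan(radians(lat1)))   # reduced latitudes
    U2 = atan((1 - F) * tan(radians(lat2)))
    sinU1, cosU1 = sin(U1), cos(U1)
    sinU2, cosU2 = sin(U2), cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = sin(lam), cos(lam)
        sin_sigma = sqrt((cosU2 * sin_lam) ** 2 +
                         (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        if sin_sigma == 0:
            return 0.0                        # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # cos(2*sigma_m); degenerates to 0 for equatorial lines
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
        C = F / 16 * cos2_alpha * (4 + F * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * F * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (A ** 2 - B ** 2) / B ** 2
    big_a = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    big_b = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = big_b * sin_sigma * (cos_2sm + big_b / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2) -
        big_b / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return B * big_a * (sigma - d_sigma) / 1000.0

# PPW (59°21′6″N, 2°54′1″W) to HUY (53°34′27″N, 0°21′2″W), decimal degrees
km = vincenty_km(59.351667, -2.900278, 53.574167, -0.350556)
```

For this airport pair the iteration converges in a handful of steps and returns roughly 662 km, matching the figure above.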

Haversine formula
  • 410.787 miles
  • 661.098 kilometers
  • 356.964 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
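The haversine formula is compact enough to show in full. This sketch assumes the commonly used mean Earth radius of 6371 km and the coordinates from the airport table below:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given mean radius, in km."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * radius_km * asin(sqrt(a))

# PPW (59°21′6″N, 2°54′1″W) to HUY (53°34′27″N, 0°21′2″W), decimal degrees
km = haversine_km(59.351667, -2.900278, 53.574167, -0.350556)
```

The result comes out near 661 km, slightly shorter than the ellipsoidal (Vincenty) figure, as expected for a route at these latitudes.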

How long does it take to fly from Papa Westray to Humberside?

The estimated flight time from Papa Westray Airport to Humberside Airport is 1 hour and 16 minutes.
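A common rule of thumb for such estimates (an assumption here, not the calculator's published method) is cruise time at an average speed of about 500 mph plus a fixed allowance for taxi, climb, and descent:

```python
# Rough flight-time estimate. Both constants are illustrative assumptions,
# not the calculator's exact parameters.
DISTANCE_MILES = 411.406      # Vincenty distance from above
CRUISE_MPH = 500.0            # assumed average block speed
OVERHEAD_MIN = 30             # assumed taxi/climb/descent allowance

total_min = OVERHEAD_MIN + DISTANCE_MILES / CRUISE_MPH * 60
hours, minutes = divmod(round(total_min), 60)
print(f"{hours} h {minutes} min")   # within a few minutes of the estimate above
```

With these assumed constants the rule of thumb lands within a few minutes of the published 1 hour 16 minutes.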

What is the time difference between Papa Westray and Humberside?

There is no time difference between Papa Westray and Humberside.

Flight carbon footprint between Papa Westray Airport (PPW) and Humberside Airport (HUY)

On average, flying from Papa Westray to Humberside generates about 86 kg (roughly 189 pounds) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
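The unit conversion behind the pounds figure can be checked directly from the exact definition of the avoirdupois pound:

```python
KG_PER_LB = 0.45359237        # exact definition of the avoirdupois pound

co2_kg = 86.0
co2_lb = co2_kg / KG_PER_LB   # ≈ 189.6 lb for exactly 86 kg
```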

Map of flight path and driving directions from Papa Westray to Humberside

See the map of the shortest flight path between Papa Westray Airport (PPW) and Humberside Airport (HUY).

Airport information

Origin: Papa Westray Airport
City: Papa Westray
Country: United Kingdom
IATA Code: PPW
ICAO Code: EGEP
Coordinates: 59°21′6″N, 2°54′1″W
Destination: Humberside Airport
City: Humberside
Country: United Kingdom
IATA Code: HUY
ICAO Code: EGNJ
Coordinates: 53°34′27″N, 0°21′2″W