
How far is Bar Harbor, ME, from London?

The distance between London (London International Airport) and Bar Harbor (Hancock County–Bar Harbor Airport) is 647 miles / 1041 kilometers / 562 nautical miles.

The driving distance from London (YXU) to Bar Harbor (BHB) is 766 miles / 1233 kilometers, and travel time by car is about 16 hours 43 minutes.

London International Airport – Hancock County–Bar Harbor Airport

647 miles / 1041 kilometers / 562 nautical miles


Distance from London to Bar Harbor

There are several ways to calculate the distance from London to Bar Harbor. Here are two standard methods:

Vincenty's formula (applied above)
  • 647.006 miles
  • 1041.255 kilometers
  • 562.233 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
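The page doesn't publish its implementation, but the figures above can be reproduced with Vincenty's inverse method on the WGS-84 ellipsoid. Below is a minimal Python sketch; the function name and convergence tolerance are my own, and the coordinates are the decimal-degree equivalents of the airport coordinates listed at the bottom of the page.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Inverse Vincenty formula on the WGS-84 ellipsoid; returns meters.
    (May fail to converge for nearly antipodal points.)"""
    a = 6378137.0            # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563    # WGS-84 flattening
    b = (1 - f) * a          # semi-minor axis

    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)

    # Reduced latitudes
    U1 = math.atan((1 - f) * math.tan(phi1))
    U2 = math.atan((1 - f) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # cos2_alpha == 0 on the equator; cos(2*sigma_m) is then irrelevant
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)  # distance in meters

# YXU -> BHB, decimal degrees from the airport data below
meters = vincenty_distance(43.0356, -81.1539, 44.45, -68.3614)
print(f"{meters / 1000:.3f} km")  # ~1041 km, matching the figure above
```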

Haversine formula
  • 645.299 miles
  • 1038.509 kilometers
  • 560.750 nautical miles

The haversine formula calculates the great-circle distance between latitude/longitude points assuming a spherical earth, which is the shortest path between the two points along the sphere's surface.
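A corresponding Python sketch of the haversine formula, assuming a mean Earth radius of 6371 km (the value the figures above are consistent with):

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere; returns kilometers."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# YXU -> BHB with the same decimal-degree coordinates as before
print(f"{haversine_distance(43.0356, -81.1539, 44.45, -68.3614):.3f} km")  # ~1038.5 km
```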

How long does it take to fly from London to Bar Harbor?

The estimated flight time from London International Airport to Hancock County–Bar Harbor Airport is 1 hour and 43 minutes.
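The page doesn't state how it derives this estimate. Calculators like this one typically add a fixed taxi/takeoff/landing overhead to time at an assumed cruise speed. The sketch below uses illustrative parameters (500 mph cruise, 30 minutes overhead), not the site's actual values, so its result differs slightly from the 1 hour 43 minutes above:

```python
def estimated_flight_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    # Assumed model: fixed taxi/takeoff/landing overhead plus time at cruise speed.
    return overhead_min + distance_miles / cruise_mph * 60

minutes = estimated_flight_minutes(647)
print(f"{int(minutes // 60)} h {round(minutes % 60)} min")  # ~1 h 48 min with these assumptions
```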

What is the time difference between London and Bar Harbor?

There is no time difference between London and Bar Harbor.

Flight carbon footprint between London International Airport (YXU) and Hancock County–Bar Harbor Airport (BHB)

On average, flying from London to Bar Harbor generates about 118 kg (261 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
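The kilogram-to-pound conversion is straightforward; note that 118 kg converts to roughly 260 lbs, so the page's 261 lbs suggests its unrounded per-passenger estimate is slightly above 118 kg:

```python
KG_TO_LB = 2.20462  # pounds per kilogram

co2_kg = 118  # rounded per-passenger estimate from the page
print(f"{co2_kg * KG_TO_LB:.0f} lbs")  # 260 lbs; the page's 261 implies an unrounded kg value
```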

Map of flight path and driving directions from London to Bar Harbor

See the map of the shortest flight path between London International Airport (YXU) and Hancock County–Bar Harbor Airport (BHB).

Airport information

Origin: London International Airport
City: London
Country: Canada
IATA Code: YXU
ICAO Code: CYXU
Coordinates: 43°2′8″N, 81°9′14″W
Destination: Hancock County–Bar Harbor Airport
City: Bar Harbor, ME
Country: United States
IATA Code: BHB
ICAO Code: KBHB
Coordinates: 44°27′0″N, 68°21′41″W
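The coordinates above are given in degrees, minutes, and seconds, while the distance formulas earlier expect signed decimal degrees. A small conversion helper (the function name is my own):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere letter to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# YXU: 43°2′8″N, 81°9′14″W -> (43.0356, -81.1539)
print(dms_to_decimal(43, 2, 8, "N"), dms_to_decimal(81, 9, 14, "W"))
# BHB: 44°27′0″N, 68°21′41″W -> (44.45, -68.3614)
print(dms_to_decimal(44, 27, 0, "N"), dms_to_decimal(68, 21, 41, "W"))
```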