
How far is North Bay from London?

The distance between London (London International Airport) and North Bay (North Bay/Jack Garland Airport) is 245 miles / 394 kilometers / 213 nautical miles.

The driving distance from London (YXU) to North Bay (YYB) is 317 miles / 510 kilometers, and travel time by car is about 6 hours 39 minutes.

London International Airport – North Bay/Jack Garland Airport

245 miles / 394 kilometers / 213 nautical miles

Distance from London to North Bay

There are several ways to calculate the distance from London to North Bay. Here are two standard methods:

Vincenty's formula (applied above)
  • 245.088 miles
  • 394.431 kilometers
  • 212.975 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.

Haversine formula
  • 245.140 miles
  • 394.514 kilometers
  • 213.021 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth. This is the great-circle distance: the shortest path between two points along the surface of the sphere.
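As a sketch, the haversine figure above can be reproduced in a few lines of Python. The coordinates are converted from the DMS values listed under Airport information, and the 6,371 km mean Earth radius is a common convention (the site's exact radius choice is not stated, so the last decimal may differ slightly):

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two lat/lon points on a spherical Earth."""
    R = 6371.0  # mean Earth radius in kilometers (assumed; the site may use another value)
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * R * asin(sqrt(a))

# YXU: 43°2′8″N, 81°9′14″W   → (43.0356, -81.1539)
# YYB: 46°21′48″N, 79°25′22″W → (46.3633, -79.4228)
d_km = haversine_km(43.0356, -81.1539, 46.3633, -79.4228)
print(d_km)  # roughly 394 km, matching the figure above
```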

How long does it take to fly from London to North Bay?

The estimated flight time from London International Airport to North Bay/Jack Garland Airport is 57 minutes.
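The 57-minute figure is consistent with a common rule of thumb for short flights (an assumption here, not necessarily the site's formula): a fixed allowance of about half an hour for taxi, climb, and descent, plus cruise time at roughly 500 mph.

```python
def estimated_flight_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough flight-time estimate: fixed taxi/climb/descent overhead plus cruise time."""
    return overhead_min + distance_miles / cruise_mph * 60

print(estimated_flight_minutes(245))  # about 59 minutes, close to the 57 quoted
```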

What is the time difference between London and North Bay?

There is no time difference between London and North Bay.

Flight carbon footprint between London International Airport (YXU) and North Bay/Jack Garland Airport (YYB)

On average, flying from London to North Bay generates about 61 kg of CO2 per passenger (roughly 135 pounds). These figures are estimates and include only the CO2 generated by burning jet fuel.
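The pound figure is a straight unit conversion (1 kg ≈ 2.20462 lb); 61 kg works out to about 134.5 lb, so the quoted 135 lb suggests the underlying estimate is rounded:

```python
KG_TO_LB = 2.20462  # pounds per kilogram

co2_kg = 61
co2_lb = co2_kg * KG_TO_LB
print(f"{co2_lb:.1f} lb")  # 134.5 lb, which the site rounds to 135
```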

Map of flight path and driving directions from London to North Bay

See the map of the shortest flight path between London International Airport (YXU) and North Bay/Jack Garland Airport (YYB).

Airport information

Origin: London International Airport
City: London
Country: Canada
IATA Code: YXU
ICAO Code: CYXU
Coordinates: 43°2′8″N, 81°9′14″W
Destination: North Bay/Jack Garland Airport
City: North Bay
Country: Canada
IATA Code: YYB
ICAO Code: CYYB
Coordinates: 46°21′48″N, 79°25′22″W
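The coordinates above are given in degrees, minutes, and seconds; a small helper (a sketch, with simple hemisphere handling) converts them to the signed decimal degrees a distance formula needs:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# YXU: 43°2′8″N, 81°9′14″W
print(dms_to_decimal(43, 2, 8, "N"))   # about 43.0356
print(dms_to_decimal(81, 9, 14, "W"))  # about -81.1539
```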