
How far is Texada from Kasabonika?

The distance between Kasabonika (Kasabonika Airport) and Texada (Texada/Gillies Bay Airport) is 1550 miles / 2494 kilometers / 1347 nautical miles.

The driving distance from Kasabonika (XKS) to Texada (YGB) is 2112 miles / 3399 kilometers, and travel time by car is about 51 hours 26 minutes.

Kasabonika Airport – Texada/Gillies Bay Airport

  • 1550 miles
  • 2494 kilometers
  • 1347 nautical miles

Distance from Kasabonika to Texada

There are several ways to calculate the distance from Kasabonika to Texada. Here are two standard methods:

Vincenty's formula (applied above)
  • 1549.686 miles
  • 2493.978 kilometers
  • 1346.641 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
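As a rough illustration, here is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid. It follows the standard published iteration and uses the airport coordinates from the table below converted to decimal degrees; it is a sketch of the technique, not the exact code behind this page.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns statute miles."""
    a = 6378137.0                      # semi-major axis (metres)
    f = 1 / 298.257223563              # flattening
    b = (1 - f) * a                    # semi-minor axis (metres)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):               # iterate until lambda converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0                 # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
        - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
                                * (-3 + 4 * cos_2sigma_m ** 2)))
    metres = b * A * (sigma - delta_sigma)
    return metres / 1609.344           # metres -> statute miles

# XKS -> YGB, decimal degrees from the "Airport information" table below
print(round(vincenty_miles(53.5244, -88.6428, 49.6942, -124.5178), 1))  # ~1549.7
```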

Haversine formula
  • 1544.889 miles
  • 2486.257 kilometers
  • 1342.472 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
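For comparison, a minimal Python sketch of the haversine computation, assuming a mean Earth radius of 6,371 km (the site's exact radius isn't stated) and the same coordinates as above:

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle (haversine) distance on a sphere, returned in statute miles."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    km = 2 * radius_km * math.asin(math.sqrt(h))
    return km / 1.609344

# XKS -> YGB with the coordinates listed under "Airport information"
print(round(haversine_miles(53.5244, -88.6428, 49.6942, -124.5178), 1))  # ~1545
```

The small gap between the two results (about 5 miles) reflects the spherical approximation versus the ellipsoidal model.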

How long does it take to fly from Kasabonika to Texada?

The estimated flight time from Kasabonika Airport to Texada/Gillies Bay Airport is 3 hours and 26 minutes.
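The calculator doesn't publish the parameters behind this estimate. A common rule of thumb is a fixed cruise speed of roughly 500 mph plus about half an hour for taxi, takeoff and landing, which lands in the same ballpark; both constants in the sketch below are assumptions, not the site's actual model.

```python
def estimated_flight_time(distance_miles, cruise_mph=500.0, overhead_min=30.0):
    """Rule-of-thumb block time: cruise at a fixed speed plus fixed overhead.
    Both constants are assumptions, not the calculator's published parameters."""
    total_min = distance_miles / cruise_mph * 60 + overhead_min
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} hours {minutes} minutes"

print(estimated_flight_time(1549.686))  # ~3 hours 36 minutes (the page shows 3 h 26 min)
```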

Flight carbon footprint between Kasabonika Airport (XKS) and Texada/Gillies Bay Airport (YGB)

On average, flying from Kasabonika to Texada generates about 183 kg of CO2 per passenger (183 kilograms is equal to about 403 pounds). These figures are estimates and include only the CO2 generated by burning jet fuel.
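A quick sanity check on those figures: 183 kg converts to about 403 lb, and dividing by the Vincenty distance gives the implied per-passenger emission rate. This sketch uses only the numbers quoted on this page:

```python
co2_kg = 183.0                 # quoted per-passenger estimate
distance_miles = 1549.686      # Vincenty distance from above

co2_lb = co2_kg * 2.20462              # kg -> lb
kg_per_mile = co2_kg / distance_miles  # implied emission rate

print(f"{co2_lb:.0f} lb")                               # ~403 lb
print(f"{kg_per_mile:.3f} kg CO2 per passenger-mile")   # ~0.118
```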

Map of flight path and driving directions from Kasabonika to Texada

See the map of the shortest flight path between Kasabonika Airport (XKS) and Texada/Gillies Bay Airport (YGB).

Airport information

Origin: Kasabonika Airport
City: Kasabonika
Country: Canada
IATA Code: XKS
ICAO Code: CYAQ
Coordinates: 53°31′28″N, 88°38′34″W
Destination: Texada/Gillies Bay Airport
City: Texada
Country: Canada
IATA Code: YGB
ICAO Code: CYGB
Coordinates: 49°41′39″N, 124°31′4″W
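The coordinates above are given in degrees-minutes-seconds; the decimal-degree values used in the earlier sketches come from the usual conversion (degrees + minutes/60 + seconds/3600, negated for west longitudes and south latitudes). A small helper to illustrate:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds to signed decimal degrees."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

# Kasabonika Airport (XKS): 53°31'28"N, 88°38'34"W
print(dms_to_decimal(53, 31, 28, "N"), dms_to_decimal(88, 38, 34, "W"))
# -> 53.5244..., -88.6428...

# Texada/Gillies Bay Airport (YGB): 49°41'39"N, 124°31'4"W
print(dms_to_decimal(49, 41, 39, "N"), dms_to_decimal(124, 31, 4, "W"))
# -> 49.6942..., -124.5178...
```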