How far is Prince George from Texada?

The distance between Texada (Texada/Gillies Bay Airport) and Prince George (Prince George Airport) is 301 miles / 484 kilometers / 261 nautical miles.

The driving distance from Texada (YGB) to Prince George (YXS) is 608 miles / 979 kilometers, and travel time by car is about 15 hours 9 minutes.

Texada/Gillies Bay Airport – Prince George Airport

301 miles / 484 kilometers / 261 nautical miles

Distance from Texada to Prince George

There are several ways to calculate the distance from Texada to Prince George. Here are two standard methods:

Vincenty's formula (applied above)
  • 300.536 miles
  • 483.666 kilometers
  • 261.159 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
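
For readers who want to reproduce the figure, here is a minimal Python sketch of the standard Vincenty inverse method on the WGS-84 ellipsoid. It is not necessarily the exact code behind this page, but it lands on essentially the same number:

```python
import math

# WGS-84 ellipsoid constants
A_AXIS = 6378137.0            # semi-major axis (metres)
F = 1 / 298.257223563         # flattening
B_AXIS = (1 - F) * A_AXIS     # semi-minor axis (metres)

def vincenty_distance_m(lat1, lon1, lat2, lon2):
    """Vincenty inverse solution: geodesic distance in metres."""
    U1 = math.atan((1 - F) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - F) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):                      # iterate until lambda converges
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.hypot(cosU2 * sinLam,
                              cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        if sinSigma == 0:
            return 0.0                        # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = (cosSigma - 2 * sinU1 * sinU2 / cos2Alpha
                      if cos2Alpha else 0.0)  # equatorial-line special case
        C = F / 16 * cos2Alpha * (4 + F * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * F * sinAlpha * (
            sigma + C * sinSigma * (
                cos2SigmaM + C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < 1e-12:
            break

    u2 = cos2Alpha * (A_AXIS ** 2 - B_AXIS ** 2) / B_AXIS ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    deltaSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2)
        - B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2)
        * (-3 + 4 * cos2SigmaM ** 2)))
    return B_AXIS * A * (sigma - deltaSigma)

# YGB -> YXS, using the decimal form of the coordinates listed below
m = vincenty_distance_m(49.694167, -124.517778, 53.889167, -122.678889)
print(f"{m / 1609.344:.3f} mi / {m / 1000:.3f} km / {m / 1852:.3f} NM")
# ~300.5 mi / 483.7 km / 261.2 NM
```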

Haversine formula
  • 300.299 miles
  • 483.284 kilometers
  • 260.953 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
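
The haversine formula is much shorter. A minimal Python sketch, assuming a 6,371 km mean Earth radius (the site's exact radius may differ slightly):

```python
import math

def haversine_distance_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given mean radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# YGB -> YXS, using the decimal form of the coordinates listed below
km = haversine_distance_km(49.694167, -124.517778, 53.889167, -122.678889)
print(f"{km / 1.609344:.3f} mi / {km:.3f} km / {km / 1.852:.3f} NM")
# ~300.3 mi / 483.3 km / 260.9 NM
```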

How long does it take to fly from Texada to Prince George?

The estimated flight time from Texada/Gillies Bay Airport to Prince George Airport is 1 hour and 4 minutes.
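
The page does not state how this estimate is derived. A common rule of thumb (an assumption here, not the site's published formula) adds roughly 30 minutes of taxi, climb, and descent to cruise time at about 500 mph:

```python
def estimated_flight_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough airline-style estimate: fixed overhead plus cruise time."""
    return overhead_min + distance_miles / cruise_mph * 60

minutes = estimated_flight_minutes(301)
print(f"{minutes // 60:.0f} h {minutes % 60:.0f} min")
# ~1 h 6 min, close to the 1 h 4 min quoted above
```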

What is the time difference between Texada and Prince George?

There is no time difference between Texada and Prince George.

Flight carbon footprint between Texada/Gillies Bay Airport (YGB) and Prince George Airport (YXS)

On average, flying from Texada to Prince George generates about 69 kg of CO2 per passenger, which is roughly 152 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
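
The site does not publish its emissions methodology. A minimal sketch, assuming a linear per-passenger emission factor (the 0.229 kg/mile below is back-calculated from the figures above, not an official constant), shows how such an estimate and its pound conversion can be derived:

```python
KG_PER_LB = 0.45359237       # exact kilograms per pound

# Hypothetical per-passenger emission factor: 69 kg over 301 miles
# implies roughly 0.23 kg of CO2 per mile flown.
KG_CO2_PER_MILE = 0.229

distance_miles = 301
co2_kg = distance_miles * KG_CO2_PER_MILE
print(f"{co2_kg:.0f} kg CO2 = {co2_kg / KG_PER_LB:.0f} lb")
# ~69 kg CO2 = ~152 lb
```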

Map of flight path and driving directions from Texada to Prince George

See the map of the shortest flight path between Texada/Gillies Bay Airport (YGB) and Prince George Airport (YXS).

Airport information

Origin: Texada/Gillies Bay Airport
City: Texada
Country: Canada
IATA Code: YGB
ICAO Code: CYGB
Coordinates: 49°41′39″N, 124°31′4″W
Destination: Prince George Airport
City: Prince George
Country: Canada
IATA Code: YXS
ICAO Code: CYXS
Coordinates: 53°53′21″N, 122°40′44″W
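
The coordinates above are given in degrees, minutes, and seconds. A small helper (an illustrative snippet, not part of the original page) converts them to the signed decimal degrees used by the distance formulas earlier:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds to signed decimal degrees."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

# Texada/Gillies Bay Airport (YGB): 49°41′39″N, 124°31′4″W
print(dms_to_decimal(49, 41, 39, "N"), dms_to_decimal(124, 31, 4, "W"))
# -> 49.694167 -124.517778

# Prince George Airport (YXS): 53°53′21″N, 122°40′44″W
print(dms_to_decimal(53, 53, 21, "N"), dms_to_decimal(122, 40, 44, "W"))
# -> 53.889167 -122.678889
```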