
How far is Long Beach, CA, from Prince George?

The distance between Prince George (Prince George Airport) and Long Beach (Long Beach Airport) is 1403 miles / 2259 kilometers / 1220 nautical miles.

The driving distance from Prince George (YXS) to Long Beach (LGB) is 1711 miles / 2754 kilometers, and travel time by car is about 33 hours 45 minutes.

Prince George Airport – Long Beach Airport

1403 miles / 2259 kilometers / 1220 nautical miles

Distance from Prince George to Long Beach

There are several ways to calculate the distance from Prince George to Long Beach. Here are two standard methods:

Vincenty's formula (applied above)
  • 1403.382 miles
  • 2258.525 kilometers
  • 1219.506 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
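For illustration, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid, using decimal versions of the airport coordinates listed under "Airport information" below. This is a rough sketch; the site's own implementation, coordinate precision and rounding may differ slightly.

```python
import math

def vincenty_distance_m(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns metres."""
    a = 6378137.0              # semi-major axis (m)
    f = 1 / 298.257223563      # flattening
    b = (1 - f) * a            # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):                      # iterate until lambda converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0                        # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - d_sigma)

# YXS and LGB positions in decimal degrees, converted from the DMS values below
yxs = (53.889167, -122.678889)
lgb = (33.817500, -118.151944)

metres = vincenty_distance_m(*yxs, *lgb)
print(f"{metres / 1609.344:.1f} mi / {metres / 1000:.1f} km")  # ≈ 1403.4 mi / 2258.5 km
```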

Haversine formula
  • 1404.337 miles
  • 2260.061 kilometers
  • 1220.335 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth, giving the great-circle distance, i.e. the shortest path between the two points along the earth's surface.
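As a minimal sketch, the same calculation with the haversine formula reproduces the figures above. The mean earth radius of 6371 km is an assumption; the page does not state which value it uses.

```python
import math

def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds (as listed under "Airport information") to decimal degrees."""
    value = deg + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a spherical earth, in kilometres."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

yxs = (dms_to_decimal(53, 53, 21, "N"), dms_to_decimal(122, 40, 44, "W"))
lgb = (dms_to_decimal(33, 49, 3, "N"), dms_to_decimal(118, 9, 7, "W"))

km = haversine_km(*yxs, *lgb)
print(f"{km * 0.621371:.1f} mi / {km:.1f} km / {km * 0.539957:.1f} NM")
# ≈ 1404.3 mi / 2260.0 km / 1220.3 NM
```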

How long does it take to fly from Prince George to Long Beach?

The estimated flight time from Prince George Airport to Long Beach Airport is 3 hours and 9 minutes.
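The page does not state how the flight time estimate is derived. A common rule of thumb is cruise distance at a fixed average speed plus a fixed allowance for taxi, climb and descent; the sketch below uses assumed parameters (500 mph and 30 minutes), so it lands near, but not exactly on, the 3 hours 9 minutes figure above.

```python
def estimate_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough block-time estimate: cruise time plus a fixed taxi/climb/descent allowance.
    The speed and allowance are assumptions, not this site's published parameters."""
    total_min = distance_miles / cruise_mph * 60 + overhead_min
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} h {minutes} min"

print(estimate_flight_time(1403))  # ≈ "3 h 18 min" with these assumed parameters
```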

What is the time difference between Prince George and Long Beach?

There is no time difference between Prince George and Long Beach.

Flight carbon footprint between Prince George Airport (YXS) and Long Beach Airport (LGB)

On average, flying from Prince George to Long Beach generates about 173 kg of CO2 per passenger, which is roughly 382 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
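The unit conversion and the implied per-distance intensity can be checked directly. In the sketch below the emission factor is simply back-calculated from the page's own figures, not an independent estimate.

```python
distance_km = 2258.5   # Vincenty distance from above
co2_kg = 173           # per-passenger estimate from this page

co2_lb = co2_kg * 2.20462                    # kilograms -> pounds
g_per_pax_km = co2_kg * 1000 / distance_km   # grams of CO2 per passenger-kilometre

print(f"{co2_lb:.0f} lb, {g_per_pax_km:.0f} g CO2 per passenger-km")
# ≈ 381 lb (the page rounds to 382) and ~77 g per passenger-km
```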

Map of flight path and driving directions from Prince George to Long Beach

See the map of the shortest flight path between Prince George Airport (YXS) and Long Beach Airport (LGB).

Airport information

Origin: Prince George Airport
City: Prince George
Country: Canada
IATA Code: YXS
ICAO Code: CYXS
Coordinates: 53°53′21″N, 122°40′44″W
Destination: Long Beach Airport
City: Long Beach, CA
Country: United States
IATA Code: LGB
ICAO Code: KLGB
Coordinates: 33°49′3″N, 118°9′7″W