How far is St John's from Los Angeles, CA?
The distance between Los Angeles (Los Angeles International Airport) and St John's (V. C. Bird International Airport) is 3672 miles / 5909 kilometers / 3191 nautical miles.
Los Angeles International Airport – V. C. Bird International Airport
Distance from Los Angeles to St John's
There are several ways to calculate the distance from Los Angeles to St John's. Here are two standard methods:
Vincenty's formula (applied above)
- 3671.637 miles
- 5908.928 kilometers
- 3190.566 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
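As a minimal sketch, the ellipsoidal distance can be computed in Python with the geographiclib package, which implements Karney's algorithm (a modern refinement of the Vincenty approach). The decimal-degree coordinates below are converted from the airport coordinates listed further down; the library choice and rounding are this sketch's assumptions, not necessarily how the figure above was produced.

```python
# Ellipsoidal (WGS-84) distance between LAX and ANU using geographiclib
# (pip install geographiclib). Karney's algorithm is used here as a stand-in
# for Vincenty's formula; the two agree to well under a metre for this route.
from geographiclib.geodesic import Geodesic

LAX = (33.9425, -118.4078)   # 33°56′33″N, 118°24′28″W in decimal degrees
ANU = (17.1367, -61.7925)    # 17°8′12″N, 61°47′33″W in decimal degrees

result = Geodesic.WGS84.Inverse(LAX[0], LAX[1], ANU[0], ANU[1])
meters = result["s12"]                          # geodesic distance in metres
print(f"{meters / 1609.344:.1f} miles")         # ~3672 mi, in line with the figure above
print(f"{meters / 1000:.1f} kilometers")        # ~5909 km
print(f"{meters / 1852:.1f} nautical miles")    # ~3191 nmi
```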
Haversine formula
- 3667.232 miles
- 5901.838 kilometers
- 3186.738 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
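The haversine figure can be reproduced in a few lines of plain Python. The mean Earth radius of 6371 km is an assumption of this sketch, so the result may differ from the figure above by a kilometre or so depending on the radius and coordinate precision used.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given radius (mean Earth radius by default)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

# LAX and ANU coordinates converted from the degrees/minutes/seconds listed below
km = haversine_km(33.9425, -118.4078, 17.1367, -61.7925)
print(f"{km:.1f} km, {km / 1.609344:.1f} miles, {km / 1.852:.1f} nautical miles")
# ~5902 km / ~3667 mi / ~3187 nmi, matching the haversine figures above
```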
How long does it take to fly from Los Angeles to St John's?
The estimated flight time from Los Angeles International Airport to V. C. Bird International Airport is 7 hours and 27 minutes.
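Such estimates are typically derived from the great-circle distance and an assumed average speed. The 500 mph block speed below is an assumption chosen for illustration, not necessarily the method behind the quoted figure.

```python
# Rough sanity check of the flight-time estimate, assuming ~500 mph average speed
distance_miles = 3672          # great-circle distance from the figures above
assumed_speed_mph = 500        # assumption covering cruise, climb, and descent

hours = distance_miles / assumed_speed_mph
h, m = int(hours), round((hours - int(hours)) * 60)
print(f"Estimated flight time: about {h} h {m} min")
# prints roughly 7 h 21 min, in the same ballpark as the quoted 7 h 27 min
```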
What is the time difference between Los Angeles and St John's?
St John's observes Atlantic Standard Time (UTC−4) year-round, so it is 4 hours ahead of Los Angeles during Pacific Standard Time (UTC−8) and 3 hours ahead during Pacific Daylight Time (UTC−7).
Flight carbon footprint between Los Angeles International Airport (LAX) and V. C. Bird International Airport (ANU)
On average, flying from Los Angeles to St John's generates about 416 kg of CO2 per passenger; 416 kilograms is about 917 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
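As a rough illustration, a per-passenger figure like this can be approximated from the flight distance and an assumed emission factor. The 0.07 kg of CO2 per passenger-kilometre used below is an assumed round number for long-haul travel, chosen only to show the calculation, not the factor actually used for the estimate above.

```python
# Distance-based per-passenger CO2 sketch (illustrative assumptions only)
distance_km = 5909
assumed_kg_co2_per_pax_km = 0.07   # assumption: ~70 g CO2 per passenger-kilometre

co2_kg = distance_km * assumed_kg_co2_per_pax_km
print(f"~{co2_kg:.0f} kg CO2 per passenger")         # ~414 kg, close to the quoted 416 kg
print(f"~{co2_kg * 2.20462:.0f} lb per passenger")   # kilograms-to-pounds conversion
```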
Map of flight path from Los Angeles to St John's
See the map of the shortest flight path between Los Angeles International Airport (LAX) and V. C. Bird International Airport (ANU).
Airport information
| Origin | Los Angeles International Airport |
| --- | --- |
| City: | Los Angeles, CA |
| Country: | United States |
| IATA Code: | LAX |
| ICAO Code: | KLAX |
| Coordinates: | 33°56′33″N, 118°24′28″W |
| Destination | V. C. Bird International Airport |
| --- | --- |
| City: | St John's |
| Country: | Antigua and Barbuda |
| IATA Code: | ANU |
| ICAO Code: | TAPA |
| Coordinates: | 17°8′12″N, 61°47′33″W |