# Distance between Lonorore (LNE) and South West Bay (SWJ)

Flight distance from Lonorore to South West Bay (Lonorore Airport – South West Bay Airport) is 65 miles / 105 kilometers / 57 nautical miles. Estimated flight time is 37 minutes.



## How far is South West Bay from Lonorore?

There are several ways to calculate the distance between Lonorore and South West Bay. Here are two common methods:

Vincenty's formula (applied above)
• 65.205 miles
• 104.938 kilometers
• 56.662 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth’s surface, using an ellipsoidal model of the earth.

Haversine formula
• 65.294 miles
• 105.081 kilometers
• 56.739 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
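As a concrete illustration, the haversine distance can be computed directly from the airport coordinates given in the Airport information section below (converted to decimal degrees). This is a minimal sketch of the great-circle calculation, not necessarily the exact implementation used for the figures above:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points on a spherical Earth."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * radius_km * asin(sqrt(a))

# LNE (15°51′56″S, 168°10′19″E) and SWJ (16°29′42″S, 167°26′16″E) in decimal degrees
lne = (-15.86556, 168.17194)
swj = (-16.49500, 167.43778)
print(round(haversine_km(*lne, *swj), 1))  # roughly 105 km, matching the figure above
```

Vincenty's formula is not shown here because it requires an iterative solution on the WGS-84 ellipsoid; for short distances like this one, the spherical and ellipsoidal results differ by well under a kilometer.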

## How long does it take to fly from Lonorore to South West Bay?

Estimated flight time from Lonorore Airport to South West Bay Airport is 37 minutes.

## What is the time difference between Lonorore and South West Bay?

There is no time difference between Lonorore and South West Bay.

## Flight carbon footprint between Lonorore Airport (LNE) and South West Bay Airport (SWJ)

On average, flying from Lonorore to South West Bay generates about 35 kg of CO2 per passenger; 35 kilograms is equal to about 77 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.

## Map of flight path from Lonorore to South West Bay

The map shows the shortest flight path between Lonorore Airport (LNE) and South West Bay Airport (SWJ).

## Airport information

Origin: Lonorore Airport
• City: Lonorore
• Country: Vanuatu
• IATA Code: LNE
• ICAO Code: NVSO
• Coordinates: 15°51′56″S, 168°10′19″E

Destination: South West Bay Airport
• City: South West Bay
• Country: Vanuatu
• IATA Code: SWJ
• ICAO Code: NVSX
• Coordinates: 16°29′42″S, 167°26′16″E
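The coordinates above are given in degrees, minutes, and seconds; distance formulas such as haversine and Vincenty work with signed decimal degrees instead. A small conversion sketch (the helper name is hypothetical):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter (N/S/E/W)
    to signed decimal degrees; S and W become negative."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

# Lonorore Airport: 15°51′56″S, 168°10′19″E
print(round(dms_to_decimal(15, 51, 56, "S"), 5))   # -15.86556
print(round(dms_to_decimal(168, 10, 19, "E"), 5))  # 168.17194
```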