
How far is Lianyungang from Seattle, WA?

The distance between Seattle (Seattle–Tacoma International Airport) and Lianyungang (Lianyungang Baitabu Airport) is 5626 miles / 9054 kilometers / 4889 nautical miles.


Distance from Seattle to Lianyungang

There are several ways to calculate the distance from Seattle to Lianyungang. Here are two standard methods:

Vincenty's formula (applied above)
  • 5625.601 miles
  • 9053.527 kilometers
  • 4888.514 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
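For readers who want to reproduce the figure, below is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid (a standard textbook formulation, not necessarily this site's exact code), using the airport coordinates from the table at the end of the page.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns metres."""
    a = 6378137.0               # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563       # WGS-84 flattening
    b = (1 - f) * a             # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    # Iterate until the longitude on the auxiliary sphere converges.
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0          # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - d_sigma)

# SEA (47°26′56″N, 122°18′32″W) to LYG (34°32′59″N, 119°15′0″E)
km = vincenty_distance(47.4489, -122.3089, 34.5497, 119.2500) / 1000
print(round(km, 1))             # ≈ 9053.5 km, matching the figure above
```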

Haversine formula
  • 5612.233 miles
  • 9032.013 kilometers
  • 4876.897 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
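The haversine version is much shorter; here is a sketch assuming the commonly used mean Earth radius of 6,371 km (choosing a slightly different radius accounts for the last few kilometers of difference from the figure above):

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two points on a sphere, in kilometres."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

print(round(haversine_distance(47.4489, -122.3089, 34.5497, 119.2500)))
# ≈ 9030 km, within a few km of the 9032 km quoted above
```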

How long does it take to fly from Seattle to Lianyungang?

The estimated flight time from Seattle–Tacoma International Airport to Lianyungang Baitabu Airport is 11 hours and 9 minutes.
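That estimate is consistent with dividing the great-circle distance by an average speed of roughly 505 mph; the speed here is an assumption back-calculated from the figures above, not a published parameter:

```python
distance_miles = 5626
avg_speed_mph = 505            # assumed average speed, gate to gate

hours = distance_miles / avg_speed_mph
print(f"{int(hours)} h {round(hours % 1 * 60)} min")   # ≈ 11 h 8 min
```

This lands within a minute of the quoted estimate; the site's exact method is not published.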

Flight carbon footprint between Seattle–Tacoma International Airport (SEA) and Lianyungang Baitabu Airport (LYG)

On average, flying from Seattle to Lianyungang generates about 666 kilograms (roughly 1,469 pounds) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
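As a sanity check, the per-passenger figure implies roughly 0.118 kg of CO2 per mile on this route; a sketch of the arithmetic (the per-mile factor is back-calculated from the figures above, not a published constant):

```python
distance_miles = 5626
co2_kg = 666                               # per-passenger estimate quoted above

print(round(co2_kg / distance_miles, 3))   # ≈ 0.118 kg CO2 per passenger-mile
print(round(co2_kg / 0.45359237))          # ≈ 1468 lb (1 lb = 0.45359237 kg exactly;
                                           # the quoted 1,469 lb reflects rounding of
                                           # the underlying estimate)
```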

Map of flight path from Seattle to Lianyungang

See the map of the shortest flight path between Seattle–Tacoma International Airport (SEA) and Lianyungang Baitabu Airport (LYG).

Airport information

Origin: Seattle–Tacoma International Airport
City: Seattle, WA
Country: United States
IATA Code: SEA
ICAO Code: KSEA
Coordinates: 47°26′56″N, 122°18′32″W
Destination: Lianyungang Baitabu Airport
City: Lianyungang
Country: China
IATA Code: LYG
ICAO Code: ZSLG
Coordinates: 34°32′59″N, 119°15′0″E
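The coordinates above use degrees, minutes, and seconds, while the formulas earlier expect decimal degrees; a minimal conversion sketch:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert D°M′S″ plus a hemisphere letter to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

print(round(dms_to_decimal(47, 26, 56, "N"), 4))    # SEA latitude:   47.4489
print(round(dms_to_decimal(122, 18, 32, "W"), 4))   # SEA longitude: -122.3089
print(round(dms_to_decimal(34, 32, 59, "N"), 4))    # LYG latitude:   34.5497
print(round(dms_to_decimal(119, 15, 0, "E"), 4))    # LYG longitude: 119.25
```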