How far is Jixi from Sanming?

The distance between Sanming (Shaxian Airport) and Jixi (Jixi Xingkaihu Airport) is 1497 miles / 2409 kilometers / 1301 nautical miles.

The driving distance from Sanming (SQJ) to Jixi (JXA) is 2010 miles / 3234 kilometers, and travel time by car is about 36 hours 16 minutes.

Distance from Sanming to Jixi

There are several ways to calculate the distance from Sanming to Jixi. Here are two standard methods:

Vincenty's formula (applied above)
  • 1496.747 miles
  • 2408.781 kilometers
  • 1300.638 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
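As a rough illustration, the sketch below implements Vincenty's inverse formula on the WGS-84 ellipsoid, using the airport coordinates listed further down the page converted to decimal degrees. It is a minimal sketch rather than the calculator's own code, and different constants or convergence tolerances can shift the result by a few meters.

```python
import math

# WGS-84 ellipsoid constants (assumed; the page does not state which ellipsoid it uses)
A_AXIS = 6378137.0              # semi-major axis, meters
FLATTENING = 1 / 298.257223563
B_AXIS = (1 - FLATTENING) * A_AXIS

def vincenty_distance(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Geodesic distance in meters via Vincenty's inverse formula."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - FLATTENING) * math.tan(phi1))   # reduced latitudes
    U2 = math.atan((1 - FLATTENING) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0                                   # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
        C = FLATTENING / 16 * cos2_alpha * (4 + FLATTENING * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * FLATTENING * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m
                                     + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (A_AXIS ** 2 - B_AXIS ** 2) / B_AXIS ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
                                - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
                                * (-3 + 4 * cos_2sigma_m ** 2)))
    return B_AXIS * A * (sigma - delta_sigma)            # distance in meters

# Airport coordinates from the table below, converted to decimal degrees
SQJ = (26 + 25 / 60 + 34 / 3600, 117 + 50 / 60)               # Shaxian Airport
JXA = (45 + 17 / 60 + 34 / 3600, 131 + 11 / 60 + 34 / 3600)   # Jixi Xingkaihu Airport

meters = vincenty_distance(*SQJ, *JXA)
print(f"{meters / 1609.344:.3f} mi  {meters / 1000:.3f} km  {meters / 1852:.3f} NM")
```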

Haversine formula
  • 1498.323 miles
  • 2411.318 kilometers
  • 1302.007 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
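For comparison, a minimal haversine sketch under the same assumptions (coordinates from the airport information below, a mean earth radius of 6371 km) looks like this. The exact radius the calculator uses is not stated, so the last decimal places may differ slightly from the figures above.

```python
import math

EARTH_RADIUS_KM = 6371.0   # mean earth radius; an assumption, the page does not state the value used

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers on a spherical earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

km = haversine_km(26 + 25/60 + 34/3600, 117 + 50/60,             # SQJ
                  45 + 17/60 + 34/3600, 131 + 11/60 + 34/3600)   # JXA
print(f"{km / 1.609344:.3f} mi  {km:.3f} km  {km / 1.852:.3f} NM")
```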

How long does it take to fly from Sanming to Jixi?

The estimated flight time from Shaxian Airport to Jixi Xingkaihu Airport is 3 hours and 20 minutes.
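The page does not say how this estimate is derived. A common rule of thumb is the great-circle distance divided by a typical cruise speed, plus a fixed allowance for taxi, climb and descent; the sketch below uses assumed values (500 mph cruise, 30-minute allowance), so it approximates rather than reproduces the 3 hours 20 minutes figure above.

```python
# Hedged estimate: assumed cruise speed and ground-time allowance, not the calculator's own parameters
DISTANCE_MILES = 1497          # from the figures above
CRUISE_MPH = 500               # assumption
ALLOWANCE_MIN = 30             # assumption: taxi, climb and descent

total_min = DISTANCE_MILES / CRUISE_MPH * 60 + ALLOWANCE_MIN
print(f"~{int(total_min // 60)} h {int(total_min % 60)} min")   # about 3 h 29 min with these assumptions
```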

What is the time difference between Sanming and Jixi?

There is no time difference between Sanming and Jixi.

Flight carbon footprint between Shaxian Airport (SQJ) and Jixi Xingkaihu Airport (JXA)

On average, flying from Sanming to Jixi generates about 179 kg of CO2 per passenger (equivalent to about 395 pounds). This figure is an estimate and includes only the CO2 generated by burning jet fuel.
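The figures above imply an emission factor of roughly 0.074 kg of CO2 per passenger-kilometer (179 kg over 2409 km). The short sketch below only reproduces that arithmetic and the kilogram-to-pound conversion; it is not the calculator's actual emissions model.

```python
CO2_KG = 179            # from the figure above
DISTANCE_KM = 2409      # from the figure above
KG_PER_LB = 0.45359237

print(f"Implied factor: {CO2_KG / DISTANCE_KM:.3f} kg CO2 per passenger-km")  # ~0.074
print(f"{CO2_KG} kg ~= {CO2_KG / KG_PER_LB:.0f} lbs")                         # ~395 lbs
```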

Map of flight path and driving directions from Sanming to Jixi

See the map of the shortest flight path between Shaxian Airport (SQJ) and Jixi Xingkaihu Airport (JXA).

Airport information

Origin: Shaxian Airport
City: Sanming
Country: China
IATA Code: SQJ
ICAO Code: ZSSM
Coordinates: 26°25′34″N, 117°50′0″E
Destination: Jixi Xingkaihu Airport
City: Jixi
Country: China
IATA Code: JXA
ICAO Code: ZYJX
Coordinates: 45°17′34″N, 131°11′34″E
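The coordinates above are given in degrees, minutes and seconds, while the distance formulas earlier on the page need decimal degrees. A small conversion sketch (degrees + minutes/60 + seconds/3600) applied to the listed values:

```python
def dms_to_decimal(deg, minutes, seconds):
    """Convert degrees/minutes/seconds to decimal degrees."""
    return deg + minutes / 60 + seconds / 3600

# Values taken from the airport information above
print(f"SQJ: {dms_to_decimal(26, 25, 34):.4f} N, {dms_to_decimal(117, 50, 0):.4f} E")
print(f"JXA: {dms_to_decimal(45, 17, 34):.4f} N, {dms_to_decimal(131, 11, 34):.4f} E")
```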