How far is Beijing from Singapore?

The distance between Singapore (Singapore Changi Airport) and Beijing (Beijing Daxing International Airport) is 2740 miles / 4410 kilometers / 2381 nautical miles.

The driving distance from Singapore (SIN) to Beijing (PKX) is 3664 miles / 5896 kilometers, and travel time by car is about 69 hours 6 minutes.

Singapore Changi Airport – Beijing Daxing International Airport

2740 miles / 4410 kilometers / 2381 nautical miles

Distance from Singapore to Beijing

There are several ways to calculate the distance from Singapore to Beijing. Here are two standard methods:

Vincenty's formula (applied above)
  • 2740.089 miles
  • 4409.746 kilometers
  • 2381.072 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
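The calculator's exact implementation isn't published, but the figures above are consistent with the standard inverse Vincenty method on the WGS-84 ellipsoid. Here is a minimal Python sketch under that assumption; the ellipsoid constants, iteration tolerance, and the example coordinates (taken from the airport information below) are all inputs you may want to adjust:

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Inverse Vincenty distance on the WGS-84 ellipsoid, returned in miles."""
    a = 6378137.0             # semi-major axis (metres)
    f = 1 / 298.257223563     # flattening
    b = (1 - f) * a           # semi-minor axis (metres)

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))   # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0.0:
            return 0.0        # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos_sq_alpha = 1.0 - sin_alpha ** 2
        cos2sm = ((cos_sigma - 2.0 * sinU1 * sinU2 / cos_sq_alpha)
                  if cos_sq_alpha else 0.0)   # equatorial line: cos^2(alpha) = 0
        C = f / 16.0 * cos_sq_alpha * (4.0 + f * (4.0 - 3.0 * cos_sq_alpha))
        lam_prev = lam
        lam = L + (1.0 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos2sm + C * cos_sigma *
                                     (-1.0 + 2.0 * cos2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u_sq = cos_sq_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
    B = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
    delta_sigma = B * sin_sigma * (cos2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos2sm ** 2) -
        B / 6 * cos2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos2sm ** 2)))
    metres = b * A * (sigma - delta_sigma)
    return metres / 1609.344  # international mile

# SIN (1°21′0″N, 103°59′38″E) to PKX (39°30′33″N, 116°24′38″E)
print(round(vincenty_miles(1.35, 103.99389, 39.50917, 116.41056), 1))
```

With these inputs the result should land close to the 2740-mile figure quoted above.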

Haversine formula
  • 2750.072 miles
  • 4425.811 kilometers
  • 2389.747 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
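For comparison, a minimal Python sketch of the haversine formula. The Earth radius the calculator uses isn't stated, so the mean radius of 3958.8 miles below is an assumption:

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2, radius_miles=3958.8):
    """Great-circle distance on a sphere of the given mean radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_miles * math.asin(math.sqrt(a))

# SIN (1°21′0″N, 103°59′38″E) to PKX (39°30′33″N, 116°24′38″E)
print(round(haversine_miles(1.35, 103.99389, 39.50917, 116.41056), 1))  # ≈ 2750 miles
```

The spherical assumption is why this result differs from the Vincenty figure by about 10 miles.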

How long does it take to fly from Singapore to Beijing?

The estimated flight time from Singapore Changi Airport to Beijing Daxing International Airport is 5 hours and 41 minutes.
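The assumptions behind this estimate aren't published. A common rule of thumb combines a fixed allowance for taxi, climb, and descent with an average cruise speed, as in this hypothetical sketch; the 500 mph cruise speed and 30-minute overhead are illustrative assumptions, so the output only approximates the 5 hour 41 minute figure above:

```python
def estimate_flight_time(distance_miles, cruise_mph=500.0, overhead_min=30.0):
    """Rough block-time estimate: cruise segment plus a fixed overhead (both assumed)."""
    total_min = overhead_min + distance_miles / cruise_mph * 60.0
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} hours {minutes} minutes"

print(estimate_flight_time(2740))  # roughly 6 hours for SIN-PKX with these assumptions
```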

What is the time difference between Singapore and Beijing?

There is no time difference between Singapore and Beijing.

Flight carbon footprint between Singapore Changi Airport (SIN) and Beijing Daxing International Airport (PKX)

On average, flying from Singapore to Beijing generates about 303 kg of CO2 per passenger, which is roughly 669 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
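The emission factors behind the 303 kg figure aren't published; the sketch below only shows the unit conversion and the per-mile intensity implied by the quoted numbers, both derived rather than taken from the page:

```python
KG_TO_LB = 2.20462                      # kilograms to pounds

co2_kg_per_passenger = 303              # figure quoted above for SIN-PKX
co2_lb_per_passenger = co2_kg_per_passenger * KG_TO_LB
print(f"{co2_lb_per_passenger:.0f} lbs")  # ≈ 668 lbs; the page's 669 suggests its unrounded figure is slightly above 303 kg

# Implied intensity over the 2740-mile great-circle leg (derived, not published)
print(f"{co2_kg_per_passenger / 2740:.3f} kg CO2 per passenger-mile")  # ≈ 0.111
```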

Map of flight path and driving directions from Singapore to Beijing

See the map of the shortest flight path between Singapore Changi Airport (SIN) and Beijing Daxing International Airport (PKX).

Airport information

Origin: Singapore Changi Airport
City: Singapore
Country: Singapore
IATA Code: SIN
ICAO Code: WSSS
Coordinates: 1°21′0″N, 103°59′38″E
Destination: Beijing Daxing International Airport
City: Beijing
Country: China
IATA Code: PKX
ICAO Code: ZBAD
Coordinates: 39°30′33″N, 116°24′38″E
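The coordinates above are in degrees-minutes-seconds, while the distance formulas expect decimal degrees. A small conversion sketch (the function name is illustrative):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to decimal degrees."""
    sign = -1.0 if hemisphere in ("S", "W") else 1.0
    return sign * (degrees + minutes / 60.0 + seconds / 3600.0)

# Singapore Changi (SIN): 1°21′0″N, 103°59′38″E
sin_lat = dms_to_decimal(1, 21, 0, "N")      # 1.3500
sin_lon = dms_to_decimal(103, 59, 38, "E")   # ≈ 103.9939

# Beijing Daxing (PKX): 39°30′33″N, 116°24′38″E
pkx_lat = dms_to_decimal(39, 30, 33, "N")    # ≈ 39.5092
pkx_lon = dms_to_decimal(116, 24, 38, "E")   # ≈ 116.4106
```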