How far is Beijing from Singapore?

The distance between Singapore (Singapore Changi Airport) and Beijing (Beijing Nanyuan Airport) is 2757 miles / 4438 kilometers / 2396 nautical miles.

The driving distance from Singapore (SIN) to Beijing (NAY) is 3679 miles / 5921 kilometers, and travel time by car is about 69 hours 30 minutes.

Singapore Changi Airport – Beijing Nanyuan Airport

  • 2757 miles
  • 4438 kilometers
  • 2396 nautical miles

Distance from Singapore to Beijing

There are several ways to calculate the distance from Singapore to Beijing. Here are two standard methods:

Vincenty's formula (applied above)
  • 2757.464 miles
  • 4437.709 kilometers
  • 2396.171 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
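For reference, a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid. The coordinates are this page's airport coordinates rounded to four decimal places, so the output may differ slightly from the figures above; note also that Vincenty's iteration can fail to converge for nearly antipodal points.

```python
import math

def vincenty_inverse(lat1, lon1, lat2, lon2):
    """Geodesic distance in meters on the WGS-84 ellipsoid (Vincenty, 1975)."""
    a = 6378137.0                  # semi-major axis (m)
    f = 1 / 298.257223563          # flattening
    b = (1 - f) * a                # semi-minor axis (m)

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):           # iterate until the longitude converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0             # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                   if cos2_alpha else 0.0)   # equatorial line: cos^2(alpha) = 0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma *
                                     (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a * a - b * b) / (b * b)
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - d_sigma)

# Airport coordinates from this page, rounded to four decimal places
km = vincenty_inverse(1.3500, 103.9939, 39.7828, 116.3878) / 1000
print(f"{km:.1f} km")              # ≈ 4437.7 km (≈ 2757 mi)
```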

Haversine formula
  • 2767.486 miles
  • 4453.837 kilometers
  • 2404.880 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth, giving the great-circle distance – the shortest path between the two points along the surface.
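A matching haversine sketch, using the same rounded coordinates and a mean Earth radius of 6371.0088 km; the small spread between published figures mostly comes down to the radius and coordinate rounding chosen:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0088):
    """Great-circle distance in kilometers, assuming a spherical earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h))

print(f"{haversine_km(1.3500, 103.9939, 39.7828, 116.3878):.1f} km")  # ≈ 4453.9 km
```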

How long does it take to fly from Singapore to Beijing?

The estimated flight time from Singapore Changi Airport to Beijing Nanyuan Airport is 5 hours and 43 minutes.
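The page does not state how this figure is derived; estimates like it are commonly distance divided by an assumed average cruise speed, plus a fixed buffer for taxi, climb, and descent. A hypothetical sketch (the 500 mph and 30 minute values are illustrative assumptions, which is why it lands near, rather than exactly on, 5 hours 43 minutes):

```python
from datetime import timedelta

def estimate_flight_time(distance_miles, cruise_mph=500, buffer_minutes=30):
    # Cruise time from distance/speed, plus a fixed taxi/climb/descent buffer.
    return timedelta(hours=distance_miles / cruise_mph, minutes=buffer_minutes)

print(estimate_flight_time(2757.464))  # ≈ 6:00:54 under these assumptions
```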

What is the time difference between Singapore and Beijing?

There is no time difference between Singapore and Beijing; both observe UTC+8 year-round.
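A quick check with Python's standard zoneinfo module confirms this; Asia/Singapore and Asia/Shanghai are the relevant IANA zone names, and neither observes daylight saving:

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # standard library since Python 3.9

now = datetime.now(timezone.utc)
for zone in ("Asia/Singapore", "Asia/Shanghai"):
    print(zone, now.astimezone(ZoneInfo(zone)).utcoffset())
# Asia/Singapore 8:00:00
# Asia/Shanghai 8:00:00
```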

Flight carbon footprint between Singapore Changi Airport (SIN) and Beijing Nanyuan Airport (NAY)

On average, flying from Singapore to Beijing generates about 305 kg of CO2 per passenger (about 673 pounds). The figures are estimates and include only the CO2 generated by burning jet fuel.
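The pound figure is a plain unit conversion (1 kg ≈ 2.20462 lb); the page's 673 lb presumably comes from an unrounded kilogram estimate:

```python
KG_TO_LB = 2.20462  # avoirdupois pounds per kilogram

print(round(305 * KG_TO_LB))  # 672, i.e. roughly the 673 lb quoted above
```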

Map of flight path and driving directions from Singapore to Beijing

[Map: shortest flight path between Singapore Changi Airport (SIN) and Beijing Nanyuan Airport (NAY), with driving directions.]

Airport information

Origin: Singapore Changi Airport
City: Singapore
Country: Singapore
IATA Code: SIN
ICAO Code: WSSS
Coordinates: 1°21′0″N, 103°59′38″E

Destination: Beijing Nanyuan Airport
City: Beijing
Country: China
IATA Code: NAY
ICAO Code: ZBNY
Coordinates: 39°46′58″N, 116°23′16″E
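The coordinates above are given in degrees-minutes-seconds; the formulas earlier on this page need them in decimal degrees. A small hypothetical helper for the conversion:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to signed decimal degrees."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

# Singapore Changi (SIN): 1°21′0″N, 103°59′38″E
print(dms_to_decimal(1, 21, 0, "N"), dms_to_decimal(103, 59, 38, "E"))
# 1.35 103.99388888888889

# Beijing Nanyuan (NAY): 39°46′58″N, 116°23′16″E
print(dms_to_decimal(39, 46, 58, "N"), dms_to_decimal(116, 23, 16, "E"))
# 39.78277777777778 116.38777777777777
```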