
How far is Ljubljana from Hong Kong?

The distance between Hong Kong (Hong Kong International Airport) and Ljubljana (Ljubljana Jože Pučnik Airport) is 5556 miles / 8942 kilometers / 4828 nautical miles.

Hong Kong International Airport – Ljubljana Jože Pučnik Airport

  • 5556 miles
  • 8942 kilometers
  • 4828 nautical miles


Distance from Hong Kong to Ljubljana

There are several ways to calculate the distance from Hong Kong to Ljubljana. Here are two standard methods:

Vincenty's formula (applied above)
  • 5556.012 miles
  • 8941.534 kilometers
  • 4828.042 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
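
For reference, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The coordinates are the HKG and LJU positions from the airport information section, converted to decimal degrees; the calculator's exact implementation is not published, so the result may differ slightly from the figure above.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Vincenty's inverse formula on the WGS-84 ellipsoid; returns statute miles."""
    a = 6378137.0              # semi-major axis (metres)
    f = 1 / 298.257223563      # flattening
    b = (1 - f) * a            # semi-minor axis

    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(phi1))   # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0                          # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m + C * cos_sigma *
                                     (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
        B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) *
        (-3 + 4 * cos_2sigma_m ** 2)))
    metres = b * A * (sigma - delta_sigma)     # ellipsoidal geodesic length
    return metres / 1609.344                   # metres to statute miles

# HKG and LJU coordinates (decimal degrees), from the airport data below.
print(round(vincenty_miles(22.3089, 113.9150, 46.2236, 14.4575), 1))  # ≈ 5556 miles
```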

Haversine formula
  • 5546.481 miles
  • 8926.196 kilometers
  • 4819.761 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
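
A corresponding haversine sketch, assuming the commonly used mean Earth radius of 6371 km (the radius the calculator actually uses is not stated, and the result shifts slightly with that choice):

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given mean Earth radius; returns miles."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    km = 2 * radius_km * math.asin(math.sqrt(a))
    return km / 1.609344                        # kilometres to statute miles

# Same HKG and LJU coordinates as above.
print(round(haversine_miles(22.3089, 113.9150, 46.2236, 14.4575), 1))  # ≈ 5546 miles
```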

How long does it take to fly from Hong Kong to Ljubljana?

The estimated flight time from Hong Kong International Airport to Ljubljana Jože Pučnik Airport is 11 hours and 1 minute.
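
The assumptions behind this estimate are not published. A quick back-of-the-envelope check, dividing the Vincenty distance by the quoted time, gives the implied average speed:

```python
# Implied average speed from the figures quoted above (assumption: the
# quoted time covers the whole flight, including climb and descent).
distance_miles = 5556.012
flight_hours = 11 + 1 / 60
print(round(distance_miles / flight_hours))  # ≈ 504 mph
```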

Flight carbon footprint between Hong Kong International Airport (HKG) and Ljubljana Jože Pučnik Airport (LJU)

On average, flying from Hong Kong to Ljubljana generates about 657 kg of CO2 per passenger, which is equal to 1,448 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
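
The 657 kg figure is the calculator's own estimate and its methodology is not given; the snippet below only checks the kilogram-to-pound conversion quoted above:

```python
KG_TO_LB = 2.20462          # pounds per kilogram
co2_kg = 657                # per-passenger estimate quoted above
print(round(co2_kg * KG_TO_LB))  # ≈ 1448 lbs
```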

Map of flight path from Hong Kong to Ljubljana

See the map of the shortest flight path between Hong Kong International Airport (HKG) and Ljubljana Jože Pučnik Airport (LJU).

Airport information

Origin: Hong Kong International Airport
City: Hong Kong
Country: Hong Kong
IATA Code: HKG
ICAO Code: VHHH
Coordinates: 22°18′32″N, 113°54′54″E
Destination: Ljubljana Jože Pučnik Airport
City: Ljubljana
Country: Slovenia
IATA Code: LJU
ICAO Code: LJLJ
Coordinates: 46°13′25″N, 14°27′27″E