How far is Windsor from Surabaya?

The distance between Surabaya (Juanda International Airport) and Windsor (Windsor International Airport) is 9844 miles / 15843 kilometers / 8554 nautical miles.

Juanda International Airport – Windsor International Airport

Distance: 9844 miles / 15843 kilometers / 8554 nautical miles
Flight time: 19 h 8 min
CO2 emission: 1 277 kg

Distance from Surabaya to Windsor

There are several ways to calculate the distance from Surabaya to Windsor. Here are two standard methods:

Vincenty's formula (applied above)
  • 9844.135 miles
  • 15842.600 kilometers
  • 8554.320 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
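As a concrete (unofficial) sketch, here is Vincenty's iterative inverse method in Python on the WGS-84 ellipsoid, using the airport coordinates from the information section below converted to decimal degrees. Small differences from the figures above can come from coordinate rounding.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Vincenty's inverse formula on the WGS-84 ellipsoid; returns meters.
    Can fail to converge for nearly antipodal points."""
    a = 6378137.0                  # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563          # WGS-84 flattening
    b = (1 - f) * a                # semi-minor axis (m)

    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = (lon2 - lon1 + 540) % 360 - 180   # normalize to (-180, 180]
    L = math.radians(dlon)

    U1 = math.atan((1 - f) * math.tan(phi1))   # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0                          # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # cos(2 * sigma_m); cos2_alpha == 0 only on equatorial lines
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                   if cos2_alpha else 0.0)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_new = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam_new - lam) < tol:
            lam = lam_new
            break
        lam = lam_new

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
        * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma)        # distance in meters

# SUB (7°22′47″S, 112°47′13″E) to YQG (42°16′32″N, 82°57′20″W)
m = vincenty_distance(-7.3797, 112.7869, 42.2756, -82.9556)
print(f"{m / 1609.344:.1f} mi, {m / 1000:.1f} km, {m / 1852:.1f} nmi")
# should land near 9844.1 mi / 15842.6 km / 8554.3 nmi
```

Vincenty's iteration can fail to converge for nearly antipodal points; modern libraries such as geographiclib use Karney's algorithm instead, which avoids this problem.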

Haversine formula
  • 9841.371 miles
  • 15838.152 kilometers
  • 8551.918 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
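A minimal Python sketch of the haversine formula, assuming the commonly used mean Earth radius of 6 371 km (the exact radius chosen shifts the result slightly):

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere (haversine); returns kilometers."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h))

km = haversine_distance(-7.3797, 112.7869, 42.2756, -82.9556)
print(f"{km:.1f} km ({km / 1.609344:.1f} mi, {km / 1.852:.1f} nmi)")
# roughly 15 838 km, in line with the haversine figures above
```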

How long does it take to fly from Surabaya to Windsor?

The estimated flight time from Juanda International Airport to Windsor International Airport is 19 hours and 8 minutes.
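The page does not state its timing model, but estimates like this are typically derived as cruise time at a fixed average speed plus a fixed allowance for taxi, climb, and descent. A minimal sketch with assumed, hypothetical parameters (note that these particular values do not reproduce the 19 h 8 min figure exactly):

```python
def estimate_flight_time(distance_miles, cruise_mph=500.0, overhead_min=30.0):
    """Rough flight-time estimate: a fixed taxi/climb/descent allowance
    plus cruise at an assumed average ground speed. Both parameters are
    assumptions, not the site's published model."""
    total_min = overhead_min + distance_miles / cruise_mph * 60
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} h {minutes} min"

print(estimate_flight_time(9844))
# '20 h 11 min' with these assumed parameters; the page's 19 h 8 min
# implies a somewhat faster assumed average speed
```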

Flight carbon footprint between Juanda International Airport (SUB) and Windsor International Airport (YQG)

On average, flying from Surabaya to Windsor generates about 1 277 kg (2 816 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
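The pound figure is a straight unit conversion (1 lb = 0.45359237 kg exactly). Per-passenger CO2 itself is typically derived from estimated fuel burn, since burning 1 kg of jet fuel releases roughly 3.16 kg of CO2. A small sketch of both conversions:

```python
KG_PER_LB = 0.45359237       # exact definition of the pound
CO2_PER_KG_FUEL = 3.16       # approx. kg of CO2 per kg of jet fuel burned

co2_kg = 1277
print(f"{co2_kg} kg = {co2_kg / KG_PER_LB:.0f} lbs")   # -> 2815 lbs
# (the page rounds to 2 816)

# Per-passenger fuel burn implied by that CO2 figure:
fuel_kg = co2_kg / CO2_PER_KG_FUEL
print(f"~{fuel_kg:.0f} kg of jet fuel per passenger")  # ~404 kg
```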

Map of flight path from Surabaya to Windsor

See the map of the shortest flight path between Juanda International Airport (SUB) and Windsor International Airport (YQG).

Airport information

Origin: Juanda International Airport
City: Surabaya
Country: Indonesia
IATA Code: SUB
ICAO Code: WARR
Coordinates: 7°22′47″S, 112°47′13″E
Destination: Windsor International Airport
City: Windsor
Country: Canada
IATA Code: YQG
ICAO Code: CYQG
Coordinates: 42°16′32″N, 82°57′20″W
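
The coordinates are given in degrees, minutes, and seconds, while the distance formulas above expect decimal degrees. A small conversion sketch producing the values used in the earlier examples:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

sub_lat = dms_to_decimal(7, 22, 47, "S")    # -7.3797
sub_lon = dms_to_decimal(112, 47, 13, "E")  # 112.7869
yqg_lat = dms_to_decimal(42, 16, 32, "N")   # 42.2756
yqg_lon = dms_to_decimal(82, 57, 20, "W")   # -82.9556
```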