
How far is Wekweètì from Kingston?

The distance between Kingston (Kingston Norman Rogers Airport) and Wekweètì (Wekweètì Airport) is 2001 miles / 3220 kilometers / 1739 nautical miles.

The driving distance from Kingston (YGK) to Wekweètì (YFJ) is 3288 miles / 5291 kilometers, and travel time by car is about 67 hours 26 minutes.

Kingston Norman Rogers Airport – Wekweètì Airport

  • 2001 miles
  • 3220 kilometers
  • 1739 nautical miles

Distance from Kingston to Wekweètì

There are several ways to calculate the distance from Kingston to Wekweètì. Here are two standard methods:

Vincenty's formula (applied above)
  • 2000.935 miles
  • 3220.193 kilometers
  • 1738.765 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
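
For readers who want to reproduce the ellipsoidal figure, here is a minimal Python sketch. It uses pyproj's Geod class, which solves the same ellipsoidal-distance problem that Vincenty's formula addresses (via Karney's geodesic algorithm on the WGS84 ellipsoid). The coordinates are decimal-degree conversions of the values listed under "Airport information" below, so the output should match the figures above to within rounding.

```python
# Minimal sketch: ellipsoidal distance between YGK and YFJ using pyproj.
from pyproj import Geod

# Decimal-degree coordinates derived from the DMS values listed under
# "Airport information" below (lat, lon).
YGK = (44.2253, -76.5967)   # Kingston Norman Rogers Airport
YFJ = (64.1906, -114.0769)  # Wekweètì Airport

geod = Geod(ellps="WGS84")
# Geod.inv takes lon1, lat1, lon2, lat2 and returns forward/back azimuths
# plus the geodesic distance in metres.
_, _, meters = geod.inv(YGK[1], YGK[0], YFJ[1], YFJ[0])

print(f"{meters / 1609.344:.1f} miles")       # ≈ 2001 miles
print(f"{meters / 1000:.1f} kilometers")      # ≈ 3220 kilometers
print(f"{meters / 1852:.1f} nautical miles")  # ≈ 1739 nautical miles
```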

Haversine formula
  • 1996.572 miles
  • 3213.171 kilometers
  • 1734.974 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
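
The spherical calculation is simple enough to write by hand. The sketch below assumes a mean Earth radius of 6,371 km and the same decimal-degree coordinates as above; it reproduces the haversine figure quoted here to within rounding.

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two points, assuming a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    km = 2 * radius_km * math.asin(math.sqrt(a))
    return km / 1.609344  # kilometres to statute miles

# Kingston (YGK) to Wekweètì (YFJ), coordinates in decimal degrees
print(round(haversine_miles(44.2253, -76.5967, 64.1906, -114.0769), 1))
# prints a value close to the 1996.6 miles quoted above
```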

How long does it take to fly from Kingston to Wekweètì?

The estimated flight time from Kingston Norman Rogers Airport to Wekweètì Airport is 4 hours and 17 minutes.
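
The flight time shown here is an estimate derived from the distance, not a published schedule. A hypothetical back-of-the-envelope version is sketched below; the cruise speed and taxi/climb allowance are illustrative assumptions, not the calculator's actual parameters, so it only approximates the 4 hours 17 minutes quoted above.

```python
# Rough flight-time estimate from distance and an assumed average speed.
distance_miles = 2001
cruise_mph = 500        # assumed average cruise speed
overhead_minutes = 30   # assumed taxi/climb/descent allowance

total_minutes = distance_miles / cruise_mph * 60 + overhead_minutes
hours, minutes = divmod(round(total_minutes), 60)
print(f"about {hours} h {minutes} min")  # about 4 h 30 min with these assumptions
```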

Flight carbon footprint between Kingston Norman Rogers Airport (YGK) and Wekweètì Airport (YFJ)

On average, flying from Kingston to Wekweètì generates about 218 kg (480 lb) of CO2 per passenger. This figure is an estimate and includes only the CO2 generated by burning jet fuel.
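
The unit conversion and a rough per-distance figure follow directly from the numbers quoted above; the per-kilometre value below is simply the quoted total divided by the quoted distance, not an official emission factor.

```python
co2_kg = 218
co2_lb = co2_kg * 2.20462   # ≈ 480.6 lb, rounded to 480 above
per_km = co2_kg / 3220      # ≈ 0.068 kg CO2 per passenger-kilometre
print(f"{co2_lb:.0f} lb, {per_km:.3f} kg/km")
```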

Map of flight path and driving directions from Kingston to Wekweètì

See the map of the shortest flight path between Kingston Norman Rogers Airport (YGK) and Wekweètì Airport (YFJ).

Airport information

Origin: Kingston Norman Rogers Airport
City: Kingston
Country: Canada
IATA Code: YGK
ICAO Code: CYGK
Coordinates: 44°13′31″N, 76°35′48″W
Destination: Wekweètì Airport
City: Wekweètì
Country: Canada
IATA Code: YFJ
ICAO Code: CYWE
Coordinates: 64°11′26″N, 114°4′37″W
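
The coordinates above are given in degrees, minutes and seconds. A small helper (hypothetical, not part of this page) converts them to the decimal degrees used in the distance sketches earlier:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to decimal degrees."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

# Kingston Norman Rogers Airport (YGK): 44°13′31″N, 76°35′48″W
print(dms_to_decimal(44, 13, 31, "N"), dms_to_decimal(76, 35, 48, "W"))
# Wekweètì Airport (YFJ): 64°11′26″N, 114°4′37″W
print(dms_to_decimal(64, 11, 26, "N"), dms_to_decimal(114, 4, 37, "W"))
```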