
How far is Wekweètì from Tulita?

The distance between Tulita (Tulita Airport) and Wekweètì (Wekweètì Airport) is 346 miles / 556 kilometers / 300 nautical miles.

The driving distance from Tulita (ZFN) to Wekweètì (YFJ) is 804 miles / 1294 kilometers, and travel time by car is about 27 hours 12 minutes.

Tulita Airport – Wekweètì Airport

346 miles / 556 kilometers / 300 nautical miles


Distance from Tulita to Wekweètì

There are several ways to calculate the distance from Tulita to Wekweètì. Here are two standard methods:

Vincenty's formula (applied above)
  • 345.748 miles
  • 556.427 kilometers
  • 300.447 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.

Haversine formula
  • 344.427 miles
  • 554.301 kilometers
  • 299.299 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
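The haversine formula fits in a few lines of Python. This sketch assumes a mean Earth radius of 6,371 km (a common convention) and uses decimal-degree coordinates converted from the DMS values in the airport information section:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points on a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# ZFN (Tulita) and YFJ (Wekweètì) in decimal degrees
d_haversine = haversine_km(64.9094, -125.5728, 64.1906, -114.0769)
print(round(d_haversine, 1))  # ≈ 554.3 km
```

The spherical result (≈554.3 km) differs from the ellipsoidal Vincenty figure (≈556.4 km) by about 0.4%, which is typical of the spherical-Earth approximation.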

How long does it take to fly from Tulita to Wekweètì?

The estimated flight time from Tulita Airport to Wekweètì Airport is 1 hour and 9 minutes.
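A common rule of thumb for such estimates is cruise distance at an average block speed plus a fixed allowance for taxi, takeoff, and landing. The site's exact model is not published, so the speed and overhead below are illustrative assumptions; with them the estimate lands within a few minutes of the quoted figure:

```python
# Illustrative model: ~500 mph average speed plus ~30 min ground/climb overhead.
# These parameters are assumptions; the site's actual formula is unstated.
DISTANCE_MI = 345.748
AVG_SPEED_MPH = 500.0
OVERHEAD_MIN = 30.0

total_min = DISTANCE_MI / AVG_SPEED_MPH * 60 + OVERHEAD_MIN
print(f"{int(total_min // 60)} h {round(total_min % 60)} min")
```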

What is the time difference between Tulita and Wekweètì?

There is no time difference between Tulita and Wekweètì.

Flight carbon footprint between Tulita Airport (ZFN) and Wekweètì Airport (YFJ)

On average, flying from Tulita to Wekweètì generates about 76 kg (167 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
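The kilogram-to-pound conversion can be checked directly; the pound is defined as exactly 0.45359237 kg, so 76 kg comes to roughly 167.6 lb, which the page rounds to 167:

```python
KG_PER_LB = 0.45359237  # exact definition of the avoirdupois pound

co2_kg = 76
co2_lb = co2_kg / KG_PER_LB
print(round(co2_lb, 1))  # ≈ 167.6 lb
```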

Map of flight path and driving directions from Tulita to Wekweètì

See the map of the shortest flight path between Tulita Airport (ZFN) and Wekweètì Airport (YFJ).

Airport information

Origin Tulita Airport
City: Tulita
Country: Canada
IATA Code: ZFN
ICAO Code: CZFN
Coordinates: 64°54′34″N, 125°34′22″W
Destination Wekweètì Airport
City: Wekweètì
Country: Canada
IATA Code: YFJ
ICAO Code: CYWE
Coordinates: 64°11′26″N, 114°4′37″W
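The coordinates above are given in degrees, minutes, and seconds (DMS); distance formulas like those earlier on this page need signed decimal degrees. A minimal conversion sketch, using Tulita's coordinates as the example:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert DMS plus a hemisphere letter (N/S/E/W) to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# Tulita Airport: 64°54′34″N, 125°34′22″W
lat = dms_to_decimal(64, 54, 34, "N")
lon = dms_to_decimal(125, 34, 22, "W")
print(round(lat, 4), round(lon, 4))  # 64.9094 -125.5728
```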