
How far is Sendai from Ube?

The distance between Ube (Yamaguchi Ube Airport) and Sendai (Sendai Airport) is 612 miles / 985 kilometers / 532 nautical miles.

The driving distance from Ube (UBJ) to Sendai (SDJ) is 781 miles / 1257 kilometers, and travel time by car is about 15 hours 18 minutes.

Yamaguchi Ube Airport – Sendai Airport

  • 612 miles
  • 985 kilometers
  • 532 nautical miles


Distance from Ube to Sendai

There are several ways to calculate the distance from Ube to Sendai. Here are two standard methods:

Vincenty's formula (applied above)
  • 612.354 miles
  • 985.487 kilometers
  • 532.121 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
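As a rough sketch of the ellipsoidal calculation, the snippet below uses pyproj's Geod helper with the airport coordinates listed further down (converted to approximate decimal degrees). Note that pyproj solves the geodesic with Karney's method rather than Vincenty's original iteration, but for a route of this length the two agree to well under a metre.

```python
from pyproj import Geod

# WGS84 ellipsoid, the usual reference for ellipsoidal distances
geod = Geod(ellps="WGS84")

# Approximate decimal-degree conversions of the coordinates listed below
ubj_lon, ubj_lat = 131.2789, 33.9300   # Yamaguchi Ube Airport (UBJ)
sdj_lon, sdj_lat = 140.9169, 38.1394   # Sendai Airport (SDJ)

_, _, meters = geod.inv(ubj_lon, ubj_lat, sdj_lon, sdj_lat)
print(f"{meters / 1609.344:.1f} miles")  # roughly 612 miles on the ellipsoid
```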

Haversine formula
  • 611.566 miles
  • 984.220 kilometers
  • 531.437 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
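A minimal sketch of the haversine calculation in Python, assuming a mean Earth radius of 3,958.8 miles and the same approximate decimal-degree coordinates:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_miles(lat1, lon1, lat2, lon2, radius=3958.8):
    """Great-circle distance on a spherical Earth, in miles."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * radius * asin(sqrt(a))

# UBJ (33°55′48″N, 131°16′44″E) to SDJ (38°8′22″N, 140°55′1″E)
print(round(haversine_miles(33.9300, 131.2789, 38.1394, 140.9169), 1))  # ≈ 611.6 miles
```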

How long does it take to fly from Ube to Sendai?

The estimated flight time from Yamaguchi Ube Airport to Sendai Airport is 1 hour and 39 minutes.
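The page does not state how this estimate is derived; a common rule of thumb is a cruise speed of roughly 500 mph plus about 30 minutes for taxi, climb and descent. The hypothetical sketch below applies that rule and lands within a few minutes of the figure quoted above.

```python
# Hypothetical block-time estimate: ~500 mph cruise plus ~30 min overhead.
distance_miles = 612
cruise_mph = 500
overhead_min = 30

total_min = overhead_min + distance_miles / cruise_mph * 60
print(f"{int(total_min // 60)} h {int(total_min % 60)} min")  # ≈ 1 h 43 min
```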

What is the time difference between Ube and Sendai?

There is no time difference between Ube and Sendai.

Flight carbon footprint between Yamaguchi Ube Airport (UBJ) and Sendai Airport (SDJ)

On average, flying from Ube to Sendai generates about 114 kg (252 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.

Map of flight path and driving directions from Ube to Sendai

See the map of the shortest flight path between Yamaguchi Ube Airport (UBJ) and Sendai Airport (SDJ).

Airport information

Origin: Yamaguchi Ube Airport
City: Ube
Country: Japan
IATA Code: UBJ
ICAO Code: RJDC
Coordinates: 33°55′48″N, 131°16′44″E
Destination: Sendai Airport
City: Sendai
Country: Japan
IATA Code: SDJ
ICAO Code: RJSS
Coordinates: 38°8′22″N, 140°55′1″E