
How far is Masuda from Monbetsu?

The distance between Monbetsu (Monbetsu Airport) and Masuda (Iwami Airport) is 907 miles / 1460 kilometers / 788 nautical miles.

The driving distance from Monbetsu (MBE) to Masuda (IWJ) is 1304 miles / 2098 kilometers, and travel time by car is about 26 hours 55 minutes.

Monbetsu Airport – Iwami Airport

907 miles / 1460 kilometers / 788 nautical miles

Distance from Monbetsu to Masuda

There are several ways to calculate the distance from Monbetsu to Masuda. Here are two standard methods:

Vincenty's formula (applied above)
  • 907.331 miles
  • 1460.207 kilometers
  • 788.449 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
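As a rough illustration, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid (the semi-major axis and flattening are the standard WGS-84 constants, assumed rather than taken from this page). The coordinates are the decimal-degree equivalents of the DMS values listed under "Airport information", so the result may differ from the figure above by a fraction of a kilometre because of rounding.

```python
from math import atan, atan2, cos, radians, sin, sqrt, tan

def vincenty_km(lat1, lon1, lat2, lon2,
                a=6378.137, f=1 / 298.257223563, tol=1e-12):
    """Vincenty's inverse formula: geodesic distance on the WGS-84 ellipsoid, in km."""
    b = (1 - f) * a                             # semi-minor axis
    U1 = atan((1 - f) * tan(radians(lat1)))     # reduced latitudes
    U2 = atan((1 - f) * tan(radians(lat2)))
    L = radians(lon2 - lon1)
    sinU1, cosU1, sinU2, cosU2 = sin(U1), cos(U1), sin(U2), cos(U2)
    lam = L
    for _ in range(200):                        # iterate until lambda converges
        sin_lam, cos_lam = sin(lam), cos(lam)
        sin_sigma = sqrt((cosU2 * sin_lam) ** 2 +
                         (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m +
                                     C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break
    # Evaluate the distance from the converged values
    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
        B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# MBE -> IWJ, with the DMS coordinates under "Airport information" in decimal degrees
print(round(vincenty_km(44.3039, 143.4039, 34.6764, 131.7897), 1))  # ≈ 1460.2 km
```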

Haversine formula
  • 907.037 miles
  • 1459.735 kilometers
  • 788.194 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
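A corresponding spherical-earth sketch in Python, assuming a mean Earth radius of 6371 km (a different radius or coordinate rounding shifts the result by a kilometre or so):

```python
from math import asin, cos, radians, sin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two points on a sphere, in km."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
    return 2 * radius_km * asin(sqrt(a))

mbe = (44.3039, 143.4039)   # Monbetsu Airport, decimal degrees
iwj = (34.6764, 131.7897)   # Iwami Airport, decimal degrees
print(round(haversine_km(*mbe, *iwj), 1))  # ≈ 1460 km (close to the 1459.7 km above)
```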

How long does it take to fly from Monbetsu to Masuda?

The estimated flight time from Monbetsu Airport to Iwami Airport is 2 hours and 13 minutes.
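Calculators like this one typically derive flight time from the great-circle distance at an assumed cruise speed plus a fixed allowance for taxi, climb and descent. The page does not state its parameters, so the sketch below uses hypothetical values (500 mph cruise, 30-minute overhead) and therefore lands near, but not exactly on, the quoted 2 hours 13 minutes.

```python
# Hypothetical estimate: 500 mph cruise plus a 30-minute overhead (assumed values).
distance_miles = 907
cruise_mph = 500
overhead_min = 30
total_min = distance_miles / cruise_mph * 60 + overhead_min
print(f"{int(total_min // 60)} h {round(total_min % 60)} min")  # ≈ 2 h 19 min
```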

What is the time difference between Monbetsu and Masuda?

There is no time difference between Monbetsu and Masuda.

Flight carbon footprint between Monbetsu Airport (MBE) and Iwami Airport (IWJ)

On average, flying from Monbetsu to Masuda generates about 144 kg of CO2 per passenger, which is about 318 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
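The page does not describe its emission methodology, but the quoted numbers can be sanity-checked with a quick unit conversion and the implied per-kilometre intensity:

```python
co2_kg = 144          # per-passenger estimate from the page
distance_km = 1460

print(round(co2_kg * 2.20462, 1))   # ≈ 317.5 lb; the page's 318 lb suggests the
                                    # unrounded figure is slightly above 144 kg
print(round(co2_kg / distance_km, 3))  # ≈ 0.099 kg CO2 per passenger-km
```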

Map of flight path and driving directions from Monbetsu to Masuda

See the map of the shortest flight path between Monbetsu Airport (MBE) and Iwami Airport (IWJ).

Airport information

Origin: Monbetsu Airport
City: Monbetsu
Country: Japan
IATA Code: MBE
ICAO Code: RJEB
Coordinates: 44°18′14″N, 143°24′14″E
Destination: Iwami Airport
City: Masuda
Country: Japan
IATA Code: IWJ
ICAO Code: RJOW
Coordinates: 34°40′35″N, 131°47′23″E
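The coordinates above are given in degrees, minutes and seconds; the distance formulas earlier on the page work in decimal degrees. A small helper for the conversion, applied to both airports:

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere letter to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

# Monbetsu Airport (MBE): 44°18′14″N, 143°24′14″E
print(round(dms_to_decimal(44, 18, 14, "N"), 4),
      round(dms_to_decimal(143, 24, 14, "E"), 4))   # 44.3039 143.4039

# Iwami Airport (IWJ): 34°40′35″N, 131°47′23″E
print(round(dms_to_decimal(34, 40, 35, "N"), 4),
      round(dms_to_decimal(131, 47, 23, "E"), 4))   # 34.6764 131.7897
```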