The distance between the two planes, one located 300 miles north and the other 400 miles east of the airport, is 500 miles. This follows from the Pythagorean Theorem, or equivalently from the distance formula applied to the two planes' coordinates.
To find the distance between the two planes, you can use the Pythagorean Theorem, which relates the two perpendicular sides (legs) of a right triangle to its longest side (hypotenuse).
The question describes the situation where one plane is 300 miles north and the other is 400 miles east of Bole airport. These distances form the two perpendicular legs of a right triangle, with the distance between the two planes acting as the hypotenuse.
The Pythagorean Theorem states:
a² + b² = c²
where:
a and b are the lengths of the legs of the triangle,
c is the length of the hypotenuse.
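For readers who prefer code, here is a minimal Python sketch of the theorem (the function name hypotenuse is an illustrative choice, not part of the original problem):

```python
import math

def hypotenuse(a: float, b: float) -> float:
    """Return the hypotenuse c of a right triangle with legs a and b,
    using a**2 + b**2 = c**2."""
    return math.sqrt(a**2 + b**2)

print(hypotenuse(3, 4))  # 5.0
```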
In this scenario:
a = 300 miles (the distance north),
b = 400 miles (the distance east).
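These values also match the coordinate view mentioned at the start: placing the airport at the origin (an assumption made here for illustration), the distance formula gives the same result. A minimal Python sketch:

```python
import math

# Place the airport at the origin: one plane is 300 miles north,
# the other is 400 miles east.
plane_north = (0.0, 300.0)
plane_east = (400.0, 0.0)

# Distance formula: sqrt((x2 - x1)**2 + (y2 - y1)**2)
dx = plane_east[0] - plane_north[0]
dy = plane_east[1] - plane_north[1]
print(math.sqrt(dx**2 + dy**2))  # 500.0
```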
Plug these values into the equation:
300² + 400² = c²
Calculate each term:
90000 + 160000 = c²
Add the numbers:
250000 = c²
To find c, take the square root of both sides:
c = √250000
c = 500
Therefore, the distance between the two planes is 500 miles.
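As a quick check of the arithmetic, Python's standard-library function math.hypot computes √(a² + b²) directly (a verification sketch, not part of the original solution):

```python
import math

# math.hypot(a, b) returns sqrt(a**2 + b**2), i.e. the hypotenuse.
print(math.hypot(300, 400))  # 500.0
```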