I'm having trouble with a program and was hoping someone could help me out. I've tried a few things but they don't seem to be working. Here are the instructions from the book:
Write a C++ program that will compute the distance and time it will take (days/hours) for a hurricane to reach Ft. Lauderdale if:
1) The hurricane is at coordinates 16, -64
way south of Fla. with a speed of 20 mph
2) The hurricane is at Bermuda with a speed of 10 mph
Note: your program must be run twice for the above.
Your program should be interactive with 3 inputs, using only a little high school algebra (the Pythagorean theorem) and the math library. You must also make an assumption of how many miles there are to a degree in our hemisphere.
Any help would be much appreciated, as it'll be used in the next few months to calculate the hurricane's distance.
If I were to start this program, the first thing I would do is find the distance between the hurricane's given coordinates and Ft. Lauderdale: D = sqrt[((X2-X1)^2)+((Y2-Y1)^2)]
2. Convert that distance into real-world miles using the miles-per-degree assumption.
3. Divide that by the speed to get the hours. (The 10 mph and 20 mph should be the storm's forward speed, i.e. how fast the hurricane itself is moving toward you, not its wind speed.)
4. Convert the hours into days, hours, and minutes.