As the title says, I have to write a program that tells you how much money there is in total: it takes the number of each type of coin from the user and adds them all up. It should report cents only at first, then it has to be improved afterwards to show the amount in dollars and cents. I'm not sure about the formula, though.
For the half dollar, would it be okay to first divide 100 by 2 and then multiply that by the number of half dollars entered? And for the dollar coin, to take the number of dollar coins entered and multiply it by 100? And then do a similar thing for the other types of coins as well, for example multiplying 1 by the number of pennies, 25 by the number of quarters, etc.?
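For anyone who finds this later: yes, that approach works. Here's a minimal sketch of the idea (the language and the function names `total_in_cents` / `as_dollars_and_cents` are my own choice, since the original question doesn't say which language the assignment uses):

```python
def total_in_cents(pennies=0, nickels=0, dimes=0, quarters=0,
                   half_dollars=0, dollars=0):
    # Multiply each coin count by its value in cents, exactly as described:
    # 1 * pennies, 5 * nickels, 10 * dimes, 25 * quarters,
    # 50 (= 100 / 2) * half dollars, and 100 * dollar coins.
    return (1 * pennies + 5 * nickels + 10 * dimes
            + 25 * quarters + 50 * half_dollars + 100 * dollars)

def as_dollars_and_cents(total_cents):
    # For the improved version: integer division by 100 gives whole
    # dollars, and the remainder is the leftover cents.
    return divmod(total_cents, 100)

# Example: 7 pennies + 3 quarters + 1 half dollar + 2 dollar coins
total = total_in_cents(pennies=7, quarters=3, half_dollars=1, dollars=2)
print(total)                        # total in cents
print(as_dollars_and_cents(total))  # (dollars, cents)
```

Working in cents throughout and only converting to dollars at the end avoids floating-point rounding issues you'd get from adding up values like 0.25 directly.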
Edit: Never mind. I was able to do it on my own, thankfully.