I'm trying to build a program that takes a dollar amount, a percentage, and a number of years. In a for loop, the program is supposed to multiply the dollar amount by the percentage and add that result to the dollar amount, once per year. However, every time the for loop executes, it uses the original dollar amount instead of the updated one, so if more than one year is entered the output is incorrect. I believe the problem lies within my for loop.
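To be clear about what I expect each iteration to do: the balance should compound, i.e. new balance = old balance + old balance * rate, with the updated balance feeding into the next year. Roughly this (not my actual code, just the intended logic sketched in Python with placeholder values):

```
balance = 1000.0   # starting dollar amount (placeholder)
rate = 0.05        # percentage as a decimal (placeholder)
years = 3          # number of years (placeholder)

for year in range(years):
    # grow the balance by balance * rate, and reuse the
    # updated balance on the next pass of the loop
    balance = balance + balance * rate

print(balance)     # 1157.625 for 3 years at 5%
```

Here's the code: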