I can't seem to get the expected output for the following program:
Summary
Interest on a credit card’s unpaid balance is calculated using the average daily balance.
Suppose that netBalance is the balance shown on the bill, payment is the payment made, d1 is the number of days in the billing cycle, and d2 is the number of days before the end of the billing cycle that the payment is made. The average daily balance is then:
averageDailyBalance = (netBalance * d1 - payment * d2) / d1
If the interest rate per month is, say, 0.0152, then the interest on the unpaid balance is:
interest = averageDailyBalance * 0.0152
Instructions
Write a program that accepts as input netBalance, d1, payment, d2, and interest rate per month (interestRate).
The program outputs the interest.
Format your output to two decimal places.
Grading
When you have completed your program, click the Submit button to record your score.
This is my code (it reads the inputs but never computes or prints the interest):
#include <iostream>
#include <iomanip>
using namespace std;

int main() {
    // Write your main here
    double netBalance, d1, payment, d2, interestRate;
    double averageDailyBalance, interest;

    cout << setprecision(2) << fixed << showpoint;
    cout << "Enter net Balance: ";
    cin >> netBalance;
    cout << "Enter payment made: ";
    cin >> payment;
    cout << "Enter number of days in the billing cycle: ";
    cin >> d1;
    cout << "Enter number of days payment is made before billing cycle: ";
    cin >> d2;
    cout << "Enter interest per month: ";
    cin >> interestRate;

    // Average daily balance: the full balance carried for the whole cycle,
    // minus the payment weighted by the d2 days it was in effect
    averageDailyBalance = (netBalance * d1 - payment * d2) / d1;
    interest = averageDailyBalance * interestRate;
    cout << "Interest: " << interest << endl;

    return 0;
}
Is there any chance you mixed up which input is which here?
Also, remember that humans usually think of an interest rate as a percentage.
That is, 9.7% of 100 is 100 * 0.097 = 9.7.
Your math is right, but are you sure you entered 0.097 rather than 9.7?
Your code is fine. I suspect the problem is the data.
Also, the answer will differ depending on whether you round (or truncate) each intermediate value to two decimals or only round at the end. It's unclear which is wanted here (usually rounding or truncating at the last step is right).
Finally, double-check the expected output by hand. Can you reproduce it with a calculator?