I have a debugging problem with this program. When I debug it, I find that instead of i = 0.1
the value is i = 0.1000000000001, which is the core of the problem. Does anyone have a suggestion for how to get exactly 0.1? I would be thankful.
The code is:
Please write out the result of the division 1/3 in decimal. Completely.
0.1 is not representable as a finite binary fraction, so it cannot be represented exactly in a float; it can only be stored to some finite precision. You should use a math library that represents decimals exactly, or get rid of floating-point values entirely.
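Since the original code isn't shown, here is a minimal sketch in Python (assuming a loop that accumulates 0.1, as the question suggests) of both options: the standard decimal module for exact decimal arithmetic, and scaled integers that avoid floating point entirely.

```python
from decimal import Decimal

# Binary float: 0.1 has no exact representation, so error accumulates.
total = 0.0
for _ in range(10):
    total += 0.1
print(total)            # 0.9999999999999999, not 1.0

# decimal.Decimal: 0.1 is stored exactly when built from a string.
total = Decimal("0")
for _ in range(10):
    total += Decimal("0.1")
print(total)            # 1.0

# Avoiding floating point: count tenths as integers, divide only for display.
tenths = 0
for _ in range(10):
    tenths += 1
print(tenths / 10)      # 1.0
```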
And actually, even if you got i to be exactly 0.1, you still would not get the answer, because the real value is 1/3, and you never reach 1/3 no matter how many times you add 0.1.
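To make that last point concrete, here is a small check (again in Python, purely illustrative): 1/3 is not a whole-number multiple of 1/10, so no number of 0.1 steps can ever sum exactly to 1/3.

```python
from fractions import Fraction

third = Fraction(1, 3)
step = Fraction(1, 10)

# 1/3 divided by 1/10 is 10/3, which is not an integer,
# so no count of 0.1 steps can sum exactly to 1/3.
print(third / step)                     # 10/3
print((third / step).denominator == 1)  # False
```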