So I'm trying to write a function that converts binary code to decimal. This is what I have so far:
int binaryToDecimal( bool vect[], int size )
{
    int result = 0;
    for (int i = 0; i < size; i++)
    {
        result = vect[i] * (int)pow(2, i);
    }
    return result;
}
For those who don't know how to convert binary to decimal: say we have the binary number 0101. To convert it to decimal, we compute (0 × 2^3) + (1 × 2^2) + (0 × 2^1) + (1 × 2^0) = 5. How would I go about writing a function to do that? I think the function above converts each digit to its own decimal value, but I'm having trouble adding all of those values up to get the corresponding decimal number.
1. On the line `result = vect[i]*(int)pow(2,i);`, you need `result += ...` so the result accumulates instead of being overwritten on every iteration.
2. You may also get the wrong value because you move from left to right through the vector, and as you do, the power of two should decrease, but you are increasing it instead. You can fix this either by storing the bits in the vector in "reverse" order (least significant bit first) or by using a power of size - 1 - i, as in the sketch below.
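Here is a minimal sketch applying both fixes. Note two choices of my own that go beyond the points above: the floating-point pow() call is replaced with an integer left shift (to avoid rounding surprises), and the main() driver with its sample bits array is only there for illustration:

#include <iostream>

// Corrected function: accumulate with +=, and weight vect[i] by
// 2^(size - 1 - i) so that vect[0] is the most significant bit.
int binaryToDecimal( bool vect[], int size )
{
    int result = 0;
    for (int i = 0; i < size; i++)
    {
        // An integer left shift computes the power of two exactly,
        // with no cast or floating-point pow() needed.
        result += vect[i] * (1 << (size - 1 - i));
    }
    return result;
}

int main()
{
    bool bits[] = { false, true, false, true };   // the binary 0101
    std::cout << binaryToDecimal(bits, 4) << '\n'; // prints 5
    return 0;
}

An alternative with the same effect is to accumulate left to right with result = result * 2 + vect[i], which avoids tracking the power of two at all.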