binary to decimal

Jan 2, 2013 at 3:18pm
Hi friends,
I am trying to find the decimal form of a binary number. I've gotten this far, but I can't work out where I'm going wrong.
#include<iostream.h>
#include<conio.h>
void main()
{
clrscr();
int a,arr[20],ar[20],sum=0;
cout<<"enter the digits in the decimal form";
cin>>a;
cout<<"enter the number in decimal form";
for(int i=0;i<=a;i++)
{
cin>>arr[i];
}
for(int j=a-1;j>=0;j--)
{
ar[i]=(arr[i]*2)^j;
sum=sum+ar[i];
}
cout<<sum;
getch();
}
Can anybody help?
Jan 2, 2013 at 3:49pm
The code does not compile:
 1  #include<iostream.h>
 2  #include<conio.h>
 3  void main()
 4  {
 5      clrscr();
 6
 7      int a,arr[20],ar[20],sum=0;
 8
 9      cout<<"enter the digits in the decimal form";
10      cin>>a;
11      cout<<"enter the number in decimal form";
12      for(int i=0;i<=a;i++)
13      {
14          cin>>arr[i];
15      }
16
17      for(int j=a-1;j>=0;j--)
18      {
19          ar[i]=(arr[i]*2)^j;
20          sum=sum+ar[i];
21      }
22
23      cout<<sum;
24
25      getch();
26  }

At line 19, i is undefined — it was declared inside the first for loop and has gone out of scope.
If your compiler accepts this with no error, then I'd recommend upgrading to a newer, more standards-compliant compiler.

I've not tried to understand your algorithm; there are no comments to go on.
However, the use of the bitwise XOR operator ^ looks suspicious: in C++, ^ is bitwise XOR, not exponentiation, so (arr[i]*2)^j does not raise anything to the power j.
Last edited on Jan 2, 2013 at 3:52pm
Topic archived. No new replies allowed.