What am I doing wrong with this array?

So in this method I'm trying to make a decimal-to-binary function that I can use later. Whenever I enter a number for a, I get some weird stuff for an output. Like if I enter 6 for a, I get 2686808-192466809244744141985802686...."blah blah blah"....77024110

#include <iostream>


using namespace std;
void binary(int n)
{
    int binaryArray [8];
    int tempBinary;
    for(int i = 0; n != 0; i++)
    {
        tempBinary = n%2;
        binaryArray[i] = tempBinary;
        n = n/2;
    }
    for(int i = sizeof(binaryArray)-1; i >= 0; i--){
        cout << binaryArray[i];
    }


}

int main(int nNumberofArgs, char* pszArgs[])
{
    cout << "Please enter a value for 'A' and a value for 'B'";
    int a;
    int b;
    int n;
    cin >> a >> b;
    n = a;
    binary(n);
}



Any ideas?
Could you please use code tags?

And you got that output because you don't initialize the array completely, so it is printing rubbish. I guess you entered 6, right? Look at the last 3 numbers: 110 is binary 6.

Why not use bit shifting:
for(int i=0;i<8;i++)
{
  binaryArray[i] = n & 0x01;
  n = n >> 1;
}

for(int i=7;i>=0;i--)
{
  cout << binaryArray[i];
}
The last 3 numbers are 110 which is binary for 6 - so it works!

Now you need to avoid outputting the other stuff, maybe if you just initialize the array to all 0 first the output will make more sense.
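For example, just a sketch of that suggestion, keeping the original array size of 8:

int binaryArray[8] = {0};   // aggregate initialization: every element starts at 0 instead of holding garbage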
I went ahead and initialized all to zero. I still get all the gibberish in front, but it now ends with 00000110.
#include <iostream>


using namespace std;
void binary(int n)
{
	int binaryArray [8] = {0};
	int tempBinary;
	for(int i = 0; n != 0; i++)
	{
		tempBinary = n%2;
		binaryArray[i] = tempBinary;
		n = n/2;
	}
	for(int i = sizeof(binaryArray)-1; i >= 0; i--){
		cout << binaryArray[i];
	}


}

int main(int nNumberofArgs, char* pszArgs[])
{
	cout << "Please enter a value for 'A' and a value for 'B'";
	int a;
	int b;
	int n;
	cin >> a >> b;
	n = a;
	binary(n);
}


Look at line 15, the for loop that prints the array.

sizeof(binaryArray) is much larger than the number of elements in the array.

sizeof(binaryArray)/sizeof(binaryArray[0]) is correct.

Also, if you change line 7 (the array declaration) to:

int binaryArray [sizeof(int)*8] = {0};

there will be enough storage no matter what value is passed in (assuming bytes are 8 bits -- they aren't on all platforms), and all of the elements will be initialized to 0.
It is because you use sizeof. sizeof gives the size of the array in bytes, and your array is made of ints, which are more than 1 byte each, so it results in a larger number than the 8 you expect: 8*sizeof(int).
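Putting the last two replies together, a corrected binary() might look something like this. This is only a sketch: it assumes a non-negative value, and it keeps the same #include <iostream> / using namespace std setup as the original.

#include <iostream>
using namespace std;

void binary(int n)
{
    int binaryArray[sizeof(int) * 8] = {0};   // big enough for any int, every element starts at 0
    const int elements = sizeof(binaryArray) / sizeof(binaryArray[0]);   // element count, not byte count

    for (int i = 0; n != 0 && i < elements; i++)
    {
        binaryArray[i] = n % 2;   // least significant bit first
        n = n / 2;
    }

    for (int i = elements - 1; i >= 0; i--)   // print most significant bit first
        cout << binaryArray[i];
    cout << endl;
}

For 6 this prints a run of leading zeros ending in 110, instead of garbage.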
Huzzah! Many thanks guys/gals!
Use itoa();

/* itoa example */
#include <stdio.h>
#include <stdlib.h>

int main ()
{
  int i;
  char buffer [33];
  printf ("Enter a number: ");
  scanf ("%d",&i);
  itoa (i,buffer,10);
  printf ("decimal: %s\n",buffer);
  itoa (i,buffer,16);
  printf ("hexadecimal: %s\n",buffer);
  itoa (i,buffer,2);
  printf ("binary: %s\n",buffer);
  return 0;
}
Since the itoa() function is NOT an ANSI C++ function, here is another way to do it that I have been using on several microcontrollers.
#include <stdio.h>
#include <conio.h>

typedef unsigned int  uint;
typedef unsigned char uchar;

// Forward declaration so main() can call it
char *itob(uint uiInt);

int main()
{
    printf("Binary value of 1234 is: %s\r\n", itob(1234));
    getch();
}

// Integer to binary ascii string
char *itob(uint uiInt)
{
static char szStr[33];   // static: we return a pointer to it, so the buffer must outlive the function
uint  uiTemp;
uchar ucPtr     = 0;
uchar ucStart   = 0;
uint  uiDivider = 0x8000;

// Because the integer can be 32-bit or 16-bit, test for it.
// Compiler dependent.
    if (sizeof(uiInt) > 2) {
        uiDivider = 0x80000000;
    }

   // Do all bits and start making string on first '1' binary
    do {
        uiTemp = (uiInt / uiDivider);
        if (uiTemp == 1) {
            uiInt -= uiDivider;
           // Set flag for first '1' to do string
            ucStart = 1;
        }
        uiDivider /= 2;
       // Make ascii
        szStr[ucPtr] = (char)(uiTemp + '0');
       // If result have been a '1' then increment pointer
        if (ucStart == 1) {
            ucPtr++;
        }
   // Repeat until last bit
    } while (uiDivider >= 2);

   // Make ascii
    szStr[ucPtr++] = (char)(uiInt + '0');
   // End string with NULL
    szStr[ucPtr] = 0;

    return szStr;
} 