binary to integer

Aug 13, 2009 at 1:56pm
Hi,

I'm looking for the fastest way to convert a binary number into its decimal value. The binary number is a char array of 0 and 1.

So far, I've come up with:

int convert(char *c, int size);

int main() {
    char c[10] = {0, 1, 0, 1, 1, 0, 1, 1, 0, 1};
    convert(c, 10);
}

int convert(char *c, int size) {
    int v = 0;
    for (int i = 0; i < size; i++) {
        // shift the running value left one bit and add the next digit
        v = (v << 1) + c[i];
    }
    return v;
}

Any clever ideas?
Thanks
Aug 13, 2009 at 2:11pm
closed account (z05DSL3A)
Convert string to unsigned long integer
http://www.cplusplus.com/reference/clibrary/cstdlib/strtoul/
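
A minimal sketch of how that could look, assuming the digits are the characters '0' and '1' in a null-terminated string (the example value is just for illustration):

#include <cstdlib>
#include <iostream>

int main()
{
    const char *bits = "0101101101";                   // example input
    unsigned long value = std::strtoul(bits, NULL, 2); // base 2
    std::cout << bits << " = " << value << std::endl;
    return 0;
}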
Aug 13, 2009 at 3:30pm
I'd switch from char to bool for starters.

The formula for converting binary to an integer is to sum 2^position (with position starting from 0 at the least significant digit) * value (0 or 1) over every digit.
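
A minimal sketch of that formula, assuming the least significant digit sits at the end of the array:

#include <iostream>

int main()
{
    bool bits[10] = {0, 1, 0, 1, 1, 0, 1, 1, 0, 1};
    int value = 0;
    for (int i = 0; i < 10; i++) {
        // 2^i * bit, with position 0 at the rightmost element
        value += bits[10 - 1 - i] << i;
    }
    std::cout << value << std::endl;   // prints 365
    return 0;
}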
Aug 13, 2009 at 5:47pm
Grey Wolf: Huh?

Mac: Here's a lengthy thread on the fastest methods to perform the reverse operation: http://www.cplusplus.com/forum/general/10898/
I'm sure you'll be able to figure it out.
Aug 13, 2009 at 6:45pm
closed account (z05DSL3A)
My bad, I misread "The binary number is a char array of 0 and 1" to mean the chars '0' and '1'.

I was thinking along the lines of:

#include <iostream>
#include <stdlib.h>

int main ()
{
  // 10 digits plus the terminating '\0'
  char num[11] = {'0', '1', '0', '1', '1', '0', '1', '1', '0', '1', '\0' };
  long ul;

  ul = strtol(num, NULL, 2);
  std::cout << num << " = " << ul << std::endl;
  return 0;
}
Aug 14, 2009 at 12:07am
Use a std::bitset
#include <bitset>
#include <iostream>
#include <string>
using namespace std;

int main()
  {
  // The source 'binary' number
  string s = "0101101101";

  // Convert it to an actual number
  unsigned long n = bitset <32> ( s ).to_ulong();

  // Display it
  cout << s << " binary == " << n << " decimal\n";

  return 0;
  }

Enjoy!
Aug 17, 2009 at 5:08pm
I'm surprised it took so long for someone to suggest a std::bitset. I'm equally surprised that the user didn't respond with, "sorry my teacher hasn't covered that in class yet". Also, boost has dynamic_bitset, which has the nice feature of allowing you to specify the number of bits at run-time. It is fun to toy around with.
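
A quick sketch of how the dynamic_bitset version might look, assuming Boost is available (the string value just mirrors the earlier example):

#include <boost/dynamic_bitset.hpp>
#include <iostream>
#include <string>

int main()
{
    std::string s = "0101101101";
    boost::dynamic_bitset<> bits(s);   // size taken from the string at run-time
    std::cout << s << " binary == " << bits.to_ulong() << " decimal\n";
    return 0;
}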

I wonder if the OP's idea of "fastest" was misunderstood. Did he mean fastest as in how do I get the job done in the "fastest" amount of time, or in terms of the speed of the algorithm used? I'm guessing the former, because unless you are writing some fancy professional algorithm for a flight control computer or something to that effect, it shouldn't matter which is slightly faster. I'd be more concerned with what works and is the least convoluted to read.
Last edited on Aug 17, 2009 at 5:16pm
Aug 22, 2009 at 4:39am
I found the std::bitset from Duoas interesting, never having seen it before.
The library function strtol() will also serve, and it can accept the string directly from the keyboard.
e.g.:
#include <iostream>
#include <cstdlib>
#include <stdio.h>
using namespace std;

int main()
{
    cout << "Enter a binary string to convert to dec format\n";
    char* p;
    char str[100];
    long n = strtol(gets(str), &p, 2); // 2 is the binary radix
    cout << "This equates to a dec of " << n;
    cout << "\n";
    getchar();

    return 0;
}

Edit: Sorry, I missed the Grey Wolf pointer.
Last edited on Aug 22, 2009 at 4:42am
Aug 22, 2009 at 4:54am
Don't use gets()...that is all...
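
A minimal sketch of the same idea with std::getline in place of gets(), as one safer alternative:

#include <iostream>
#include <cstdlib>
#include <string>
using namespace std;

int main()
{
    cout << "Enter a binary string to convert to dec format\n";
    string str;
    getline(cin, str);
    long n = strtol(str.c_str(), NULL, 2); // 2 is the binary radix
    cout << "This equates to a dec of " << n << "\n";
    return 0;
}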
Topic archived. No new replies allowed.