#include <bitset>
#include <iostream>
#include <string>
using namespace std;
int main()
{
// The source 'binary' number
string s = "0101101101";
// Convert it to an actual number
unsigned long n = bitset<32>(s).to_ulong();
// Display it
cout << s << " binary == " << n << " decimal\n";
return 0;
}
I'm surprised it took so long for someone to suggest std::bitset. I'm equally surprised that the user didn't respond with, "sorry, my teacher hasn't covered that in class yet". Also, boost has dynamic_bitset, which has the nice feature of allowing you to specify the number of bits at run-time. It is fun to toy around with.
I wonder if the OP's idea of "fastest" was misunderstood. Did he mean fastest as in how do I get the job done in the "fastest" amount of time, or in terms of the speed of the algorithm used? I'm guessing the former, because unless you are writing some fancy professional algorithm for a flight control computer or something to that effect, it shouldn't matter which is slightly faster. I'd be more concerned with what works and is the least convoluted to read.
I found the std::bitset from Duoas interesting, never having seen it before.
The library function strtol() will also serve, and it can accept the string directly from the keyboard.
eg:
#include <iostream>
#include <cstdlib>
#include <cstdio>
using namespace std;
int main()
{
cout<<"Enter a binary string to convert to dec format \n";
char* p;
char str[100];
cin.getline(str, sizeof str); // gets() is unsafe (removed in C++14); read a bounded line instead
long n = strtol(str, &p, 2); // 2 is the binary radix
cout<<"This equates to a dec of "<<n;
cout<<"\n";
getchar();
return 0;
}