I am creating a binary-to-ASCII converter. The program accepts a string input, converts it to an array of ints, and converts that array to an ASCII character. The program runs as shown below, but one part of the code worries me:
#include <iostream>
#include <string>
#include <cmath>
using namespace std;

int main()
{
    // Read the binary string:
    string theString;
    cin >> theString;
    int stringLength = theString.length();
    int intString[stringLength]; // the declaration I am unsure about (see below)
    int value = 0;

    // Insert the digit values into the array:
    for (int i = 0; i < stringLength; ++i)
        intString[i] = theString[i] - '0';

    // Convert the digit array into a single integer value:
    for (int i = stringLength - 1; i >= 0; --i)
        if (intString[i])
            value += pow(2, stringLength - (i + 1));

    // Display value as integer:
    cout << "The integer value: " << value << endl;

    // Display value as character:
    cout << "The ASCII character: " << char(value) << endl;

    // Custom exit:
    cout << "\nPress enter to continue: ";
    cin.ignore(255, '\n');
    cin.clear();
    cin.get();
    return 0;
}
Sample run:

10101010
The integer value: 170
The ASCII character: ¬
Press enter to continue:
The problem is the array declaration int intString[stringLength]: a variable-length array is a GCC extension, not standard C++, so it will not work with all compilers (http://gcc.gnu.org/onlinedocs/gcc-4.1.2/gcc/Variable-Length.html#Variable-Length). What is the standard way to declare an array whose size is only known at run time?
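I was considering replacing the array with std::vector. Here is a minimal sketch of what I have in mind (the vector and the shift-based conversion loop are my own guesses at a fix, not something taken from a reference):

#include <iostream>
#include <string>
#include <vector>
using namespace std;

int main()
{
    string theString;
    cin >> theString;
    int stringLength = theString.length();

    // std::vector sizes itself at run time, so no VLA is needed:
    vector<int> intString(stringLength);
    for (int i = 0; i < stringLength; ++i)
        intString[i] = theString[i] - '0';

    // Build the value with a left shift instead of pow(), which
    // also avoids any floating-point rounding:
    int value = 0;
    for (int i = 0; i < stringLength; ++i)
        value = (value << 1) | intString[i];

    cout << "The integer value: " << value << endl;
    cout << "The ASCII character: " << char(value) << endl;
    return 0;
}

Is this the idiomatic replacement, or is there a better standard construct for an array whose size is fixed at run time?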