I am trying to write a simple program that hashes a string with SHA-3 using the Crypto++ library. I have not found a tutorial on this; instead I have looked at some other posts and tutorials that seemed fairly similar, and at the header file itself. So far I have this:
Basically, the output is complete garbage. I suspect a memory bug as well, because the last time I ran the program it corrupted my terminal. Honestly, I don't even know if I am calling the right functions. I wasn't able to find any documentation and just went off what was in the header.
I think the problem has to do with converting the input to the right type. Using memcpy was the only way I could make it work. A lot of posts advocate using .c_str(), but that returns a (signed) const char*, and no amount of casting I tried was able to fix the problem.
Also, not that it matters, but everyone else seems to use the string member function .length(). That doesn't work for me, whereas .size() works fine. What is the difference, and why would that happen?
You should probably allocate memory dynamically here: byte out[hash.DigestSize()];
and here: unsigned char buffer[in.size()];
Both of these are variable-length arrays: the size of a local array must be a compile-time constant in standard C++. Use std::vector instead.
Note that in C++, char is always 1 byte, so unless another byte type is defined somewhere (Crypto++ defines its own byte as a typedef for unsigned char), you should be using unsigned char here.
Not sure why string::length() would return a different value from string::size(). According to the documentation they are synonyms and return exactly the same value.
Since you are only hashing a single string, you can use CalculateDigest() or CalculateTruncatedDigest().
#include <iostream>
#include <string>
#include <vector>
#include <cryptopp/sha3.h>
#include <cryptopp/hex.h>
#include <cryptopp/filters.h>
using namespace std;
int main() {
    CryptoPP::SHA3_512 hash;
    cout << hash.AlgorithmName() << " Test." << endl;
    string in = "The quick brown fox jumps over the lazy dog";
    vector<CryptoPP::byte> out(hash.DigestSize());  // heap-allocated, no VLA
    hash.CalculateDigest(out.data(), reinterpret_cast<const CryptoPP::byte*>(in.data()), in.size());
    string encoded;
    CryptoPP::ArraySource(out.data(), out.size(), true, new CryptoPP::HexEncoder(new CryptoPP::StringSink(encoded)));
    cout << in << endl;
    cout << encoded << endl;  // hex string, not the raw bytes that garbled the terminal
}