Any help would be appreciated. The code is supposed to decode a string; for example, "!^&*" should output "abcd". Right now it only decodes the first letter, like this: "a^&*". I feel like this is because it is defined as a char and not a string, but it gives me an error if I change it. Any idea how I can modify this to work? Thanks in advance :)
I changed all of the zeros to i's on lines 51 and 52, and it caused an out-of-range memory access error to be thrown. Any other ideas?
Oh, right, I missed that. If your string is shorter than 26 characters, that loop will go out of bounds. So you need a nested for loop: the outer loop runs over the length of the string to be decoded, and the inner loop runs over the number of possible characters to check against, i.e. setNum. Basically, lines 50-54 become your inner loop, with the zeros changed to the variable iterated by the outer loop you have to create. Does that make sense? I ran it and it works.