The ASCII table has nothing to do with it. char, signed char, and unsigned char simply behave like ordinary integer types.
Unsigned types wrap around: if you try to go below the smallest value, the value instead wraps to the largest possible value and continues from there. The largest value an 8-bit unsigned type can hold is 255. That is why you get 255 when you assign -1 to the unsigned char.
The values in your third example seem to be incorrect. If u is 0 when you execute u--, the value of u becomes 4294967295, because that is the largest possible value u can hold.
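A minimal sketch showing both conversions (assuming a typical platform where unsigned char is 8 bits and unsigned int is 32 bits; u is taken from your example, c is just a placeholder name):

```cpp
#include <iostream>

int main() {
    unsigned char c = -1;   // -1 is converted modulo 256, so c holds 255
    std::cout << static_cast<int>(c) << '\n';   // prints 255

    unsigned int u = 0;
    u--;                    // wraps around to the largest representable value
    std::cout << u << '\n'; // prints 4294967295 if unsigned int is 32 bits
}
```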
Signed types do not have this guaranteed "wrap around" behaviour. The C++ standard simply says the result is undefined if you go below or above the allowed range, which is -128 to 127 for an 8-bit signed type (using two's complement representation).
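To see why you cannot rely on any particular result, here is a sketch using a plain int (the same principle applies to any signed type); the overflowing line is exactly what the standard leaves undefined:

```cpp
#include <climits>
#include <iostream>

int main() {
    int i = INT_MAX;
    i++;                    // signed overflow: undefined behaviour, the program
                            // is not required to wrap to INT_MIN or to do
                            // anything predictable here
    std::cout << i << '\n';
}
```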