I wrote some simple (apparently not simple enough for me) code to try to familiarize myself with bit shifting. The goal was to keep just bits 2 and 3 and discard bits 0, 1, and 4-7. I have:
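(A minimal sketch of the kind of code in question, assuming C, an unsigned char named a, and both shifts done inside a single printf expression; the exact original snippet isn't shown.)

    #include <stdio.h>

    int main(void) {
        unsigned char a = 123;            /* 0111 1011 */
        printf("%d\n", (a << 2) >> 6);    /* prints 7, not the expected 3 */
        return 0;
    }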
What I thought would happen is this:
123 = 01111011
then << 2 gives
11101100
and finally >> 6 gives
00000011
which should be 3. However, I get 7 as my output. Am I doing something wrong?
Because of integer promotion, a<<2 evaluates to an int, so the bit that would fall off the top of an 8-bit value doesn't get cut off: 123<<2 is (int)492, not (unsigned char)236, and 492>>6 is 7, which is why you see 7. To truncate to 8 bits before the right shift, do (a<<2)&255.
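A quick way to see the difference (a small sketch; the variable names and the extra printf calls are just for illustration):

    #include <stdio.h>

    int main(void) {
        unsigned char a = 123;                  /* 0111 1011 */

        printf("%d\n", (a << 2) >> 6);          /* promoted to int: 492 >> 6 == 7 */
        printf("%d\n", ((a << 2) & 255) >> 6);  /* masked to 8 bits: 236 >> 6 == 3 */

        /* Assigning back to an unsigned char also truncates to 8 bits: */
        unsigned char b = a << 2;               /* b == 236 */
        printf("%d\n", b >> 6);                 /* 3 */
        return 0;
    }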