#include <iostream>
using namespace std;
int bufsize1;               // never assigned a value
int bufsize2 = 80;
int main() {
    cout << sizeof (char)* bufsize1 << endl;
    cout << sizeof (char)* bufsize2 << endl;
}
The first cout statement prints 0, but the second one prints 80.
Here is how I am reading these statements:
1) In each cout statement, the int is cast to a char pointer.
2) The sizeof operator is then applied to the result of that cast.
What I don't understand is why the second statement would yield a char pointer that is 80 bytes long. Why does the cast take the value of bufsize2 into account?
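To spell out how I am grouping the expression, here is the second statement rewritten with the parentheses I had in mind (this parenthesized form is only my interpretation of the original line, not code I actually compiled):

// my reading: bufsize2 is converted to a char*, then sizeof is taken of that pointer
cout << sizeof ((char*) bufsize2) << endl;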