Difference between char and wchar_t

In a simple program we declare a char and a wchar_t variable.
We initialize them using some values.
And we display the output using cout.

char value=55 will output 7, because cout prints the character whose ASCII code is 55.
char value=41 will output ) for the same reason.
char value='g' will output the character g, and if we want the corresponding decimal code we cast the char to int.
Right?? I think that is right.

wchar_t does the same job as char but it can store more characters (or not?)
wchar_t value='g' will output 103 and not g.

Why is that?

Obviously I am missing something or there is something wrong with my reasoning.

#include <iostream>
using namespace std;

int main() {
    char value1 = 55;
    cout << "char 55 " << value1 << endl;
    char value2 = 41;
    cout << "char 41 " << value2 << endl;
    char value3 = 'g';
    cout << "char 'g' " << value3 << endl;
    cout << "casting char 'g' to int " << int(value3) << endl;

    wchar_t value4 = 'g';
    cout << "wchar_t 'g' " << value4 << endl;
}
wchar_t does the same job as char but it can store more characters (or not?)
It is the same difference as e.g. int/short: the value range. If int is 32 bits and short is 16 bits, you can theoretically store two shorts in one int.

wchar_t value='g' will output 103 and not g.
This depends on whether the output stream recognizes wchar_t. std::cout is a narrow-character stream and has no operator<< overload for wchar_t, so the value is promoted to int and printed as the number 103. Use std::wcout, which does have a wchar_t overload, to print it as the character g.