Because you need the window class name again when you create the window, and for some other things too (for example, the default menu is normally given the same name as the window class).
...and I know that to use TCHAR I have to #include <tchar.h>, so why don't I have to do that in Windows applications? Does windows.h bring it in?
1. Because I have made the variable which stores the class name a TCHAR, which supports Unicode, is it a MUST that I use TCHAR for all my future character variables, or can I use char if I wanted to? Do you have any tips on when it's best to use TCHAR instead of char?
2. If I want to use Unicode (TCHAR) to support the Unicode encoding scheme, do I also have to define _UNICODE? I read that you need to do that, but I haven't and I can still use TCHAR?
1. You can use what you like, so long as you take care to pass functions what they want. Many of the Windows API functions have both normal and wide versions that are called based on your definition of UNICODE; using TCHAR just means that you can then define or undefine UNICODE and all your code will still work.
2. Yes, you'll need to define UNICODE, though I believe Visual Studio actually has an option somewhere that does that automatically.
3. Yes, but you should be careful of using macros like that as they replace that set of text anywhere except in string literals. If you just want to alias a type, use a typedef instead.
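A short sketch of the difference described in point 3 (the names COUNT_MACRO and count_alias are made up for illustration):

```cpp
#include <cstring>

// A macro is blind text substitution done by the preprocessor:
#define COUNT_MACRO unsigned long

// A typedef is a real type alias known to the compiler:
typedef unsigned long count_alias;

// Afterwards both can be used like a type name:
COUNT_MACRO add_one_macro(COUNT_MACRO x) { return x + 1; }
count_alias add_one_alias(count_alias x) { return x + 1; }

// But the preprocessor never touches string literals, so the
// characters "COUNT_MACRO" below stay exactly as written:
const char* note = "COUNT_MACRO is not replaced here";
```

Both functions compile to the same thing; the difference only matters in how the name is resolved.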
My reply to 2:
OK, I guess my Visual Studio did automatically define UNICODE; I'll define it anyway just for good practice.
My reply to 3:
OK, so that's all macros are, just replacements. What is their purpose, is it for convenience reasons?
One other thing: as I use the wide version of char to support Unicode (which is TCHAR), are there other WIDE types, like a wide version of int, or was Unicode only made to support international chars?
wchar_t is NOT a char type specifically for Unicode characters. It's just a wide character type, and whether wide means 8, 16, 32 or some other number of bits depends on your compiler.
ANSI/ISO C leaves the semantics of the wide character set to the specific implementation but requires that the characters from the portable C execution set correspond to their wide character equivalents by zero extension.
Though C compilers for Windows usually define wchar_t as 16 bits, which allows you to use it for UTF-16. PS: you don't even need wchar_t for Unicode. With UTF-8 you can just use normal chars, though the standard string functions will then operate on bytes rather than characters.
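To illustrate that last point, a UTF-8 string fits in ordinary chars, but a byte-oriented function like strlen then counts bytes, not characters (utf8_word and byte_length are my own illustration names):

```cpp
#include <cstring>
#include <cstddef>

// "héllo" in UTF-8: 'é' (U+00E9) becomes the two bytes 0xC3 0xA9,
// so the string is 5 characters but 6 bytes long.
const char utf8_word[] = "h\xC3\xA9llo";

std::size_t byte_length() { return std::strlen(utf8_word); }
```

strlen reports 6 here even though a user would say the word has 5 letters.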
TCHAR is not a char type for Unicode characters either. It's just a typedef that (used together with a number of other macros and typedefs) lets you switch your entire code from char strings to wchar_t strings simply by defining the UNICODE macro (_UNICODE for the <tchar.h> names).
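The mechanism can be sketched in miniature. This is not the real <tchar.h>, just a toy with made-up names; MY_UNICODE plays the role of the real UNICODE/_UNICODE macros:

```cpp
#include <string.h>
#include <wchar.h>

// Flip this one macro and every "generic" name below changes meaning:
#ifdef MY_UNICODE
typedef wchar_t my_tchar;          // wide build
#define MY_TEXT(s) L##s            // L"..." literals
#define my_strlen wcslen
#else
typedef char my_tchar;             // narrow build
#define MY_TEXT(s) s               // plain "..." literals
#define my_strlen strlen
#endif

// The same source line compiles either way:
const my_tchar* greeting = MY_TEXT("hello");
size_t greeting_length() { return my_strlen(greeting); }
```

Compile with -DMY_UNICODE and greeting becomes a wchar_t string; without it, a plain char string. That is all TCHAR and the _T()/TEXT() macros do, just on a much larger scale.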
As to other "wide" character types:
char a;        /* at least 8 bits - sizeof(char) is 1 byte by definition,
                  and the sizes of all other types are measured in these bytes */
short b;       // at least as large as a char, not larger than an int
int c;         // at least as large as a short, not larger than a long
long d;        // at least as large as an int, not larger than a long long
long long e;   // at least as large as a long
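Those guarantees can be checked at compile time; here's a sketch using C++11 static_assert (it also shows that a "byte" here means CHAR_BIT bits, which is at least 8):

```cpp
#include <climits>

static_assert(CHAR_BIT >= 8, "a byte is at least 8 bits");
static_assert(sizeof(char) == 1, "sizeof measures in char-sized bytes");
static_assert(sizeof(short) <= sizeof(int), "short is not larger than int");
static_assert(sizeof(int) <= sizeof(long), "int is not larger than long");
static_assert(sizeof(long) <= sizeof(long long), "long is not larger than long long");
```

If any of these ever failed on some exotic platform, the compiler would refuse to build rather than silently misbehave.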
Tell me if I'm right. I think I might be thinking of Unicode in the wrong way.
Unicode is just an encoding scheme for all computers to use so that characters do not get mixed up when a program is used on another computer in a different country.
So by telling my program to use the Unicode scheme, I am just telling it how to handle characters, and it doesn't actually have anything to do with data types.
However, I have to use TCHAR because TCHAR can support 16-bit characters whereas char can't.
Think of Unicode as a giant code table, where each number represents a character. The numbers all fit in 32 bits (code points actually only go up to U+10FFFF).
I think you still didn't get what TCHAR is. TCHAR is the exact same thing as char, except that if you define the UNICODE macro (_UNICODE for the tchar header) before including it, TCHAR becomes wchar_t instead. wchar_t is a wide character type. It is NOT guaranteed to be 16 bits; it could also be 32 bits. (Technically other sizes are possible too, but you won't have to worry about those in any environment you will normally encounter, and if you do encounter one, by then you should know how to deal with it.)
That's basically the meaning of it. The number of bits just refers to how many bits a single code unit has. UTF-8, though, as has been mentioned, is somewhat special: a single character can take anywhere from one to four bytes.
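To make that concrete, here is the single code point U+00E9 ('é') in three encoding forms (a sketch using C++11 char16_t/char32_t literals; the UTF-8 bytes are written out by hand):

```cpp
#include <cstddef>

const char     utf8_form[]  = "\xC3\xA9";  // UTF-8: two bytes
const char16_t utf16_form[] = u"\u00E9";   // UTF-16: one 16-bit unit
const char32_t utf32_form[] = U"\u00E9";   // UTF-32: one 32-bit unit
```

Same character, same code table entry 0xE9; only the storage form differs.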