I am trying to convert from ANSI to Unicode using MultiByteToWideChar.
The problem is that when a newline comes along, the program fails to translate the 0d 0a into 0d00 0a00.
So I tried to add the newline manually, but again hit a dead end: the compiler was translating 0a into 0d 0a, so I couldn't just write 0d00 0a00. (And there I lost all hope.)
The code:
#include <stdio.h>
#include <windows.h>

void writeUnicode(FILE *pFile, char *buffer)
{
    // First call: query the required size in wide chars (includes the null)
    int x = MultiByteToWideChar(CP_ACP, 0, buffer, -1, NULL, 0);
    wchar_t uniBuff[x]; // VLA: C99 only; use malloc() under MSVC
    MultiByteToWideChar(CP_ACP, 0, buffer, -1, uniBuff, x);
    fwrite(uniBuff, sizeof(wchar_t), x - 1, pFile); // x-1 because i don't want the null
}
That's not the compiler, it's the runtime. It's a language feature: when you open a file for text output (i.e. the default mode), byte 10 (0x0a) is translated into the operating system's newline sequence, regardless of context. While this does have its uses, more often than not it's just annoying. If you need to preserve exact binary images, you'll have to open the file as binary. http://www.cplusplus.com/reference/clibrary/cstdio/fopen/
I myself wasted ten minutes yesterday debugging a file that was three kilobytes longer than it should have been, until I realized I was opening it in text mode. If you ask me, it was a mistake on the part of K&R to make text the default mode.
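A minimal, portable sketch of the fix (the file name example.bin and the helper names write_exact/read_exact are made up for illustration): open the file with "wb" and the 0x0a byte reaches the disk untouched, so a UTF-16LE CR LF stays exactly four bytes.

```c
#include <stdio.h>

/* Write a buffer to `path` byte-for-byte. The "b" in "wb" disables the
   runtime's newline translation, so 0x0a stays a single byte. */
size_t write_exact(const char *path, const void *data, size_t len)
{
    FILE *f = fopen(path, "wb");
    if (!f) return 0;
    size_t n = fwrite(data, 1, len, f);
    fclose(f);
    return n;
}

/* Read the file back so the bytes can be verified. */
size_t read_exact(const char *path, void *data, size_t len)
{
    FILE *f = fopen(path, "rb");
    if (!f) return 0;
    size_t n = fread(data, 1, len, f);
    fclose(f);
    return n;
}
```

Writing { 0x0d, 0x00, 0x0a, 0x00 } through write_exact() and reading it back yields the same four bytes; the same data written through a text-mode stream would grow on Windows.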
Sigh... No it won't. That just makes TCHAR the same as WCHAR, rather than CHAR. In other words, it controls the type that string-taking functions will receive.
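Roughly what the tchar.h mechanism amounts to, reduced to a sketch (TCHAR_sketch is a made-up name; the real header maps TCHAR, _T(), and the _t* aliases the same way) — note that none of this touches how a FILE* was opened:

```c
#include <wchar.h>

/* Simplified model of <tchar.h>: defining _UNICODE switches what the
   generic character type expands to, and nothing more. */
#ifdef _UNICODE
typedef wchar_t TCHAR_sketch;  /* _UNICODE defined: wide characters */
#else
typedef char TCHAR_sketch;     /* _UNICODE not defined: narrow characters */
#endif
```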
The file is still being opened as text. Didn't you read what I said before?
Besides, the wide versions of the file operations aren't really that useful. Also, the standard wide-character output functions are fputws() and fwprintf() from <wchar.h>; there are no wfwrite() or wfclose() variants.
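For reference, a stream's orientation (narrow vs. wide) is fixed by the first operation performed on it, or explicitly with fwide(). A minimal sketch using the standard API (print_wide is a made-up helper name):

```c
#include <stdio.h>
#include <wchar.h>

/* Print a wide string to a stream using the standard <wchar.h> API.
   fputws() returns a nonnegative value on success, WEOF on failure. */
int print_wide(FILE *out)
{
    fwide(out, 1);                     /* make the stream wide-oriented */
    return fputws(L"wide output\n", out);
}
```

Once a stream is wide-oriented, mixing in narrow calls like fwrite() on the same stream is undefined, which is another reason the wide stream functions see little use.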