Odd. That does sound like something I would say, but I can't recall ever saying anything like that.
Does this mean that console applications are doomed to present ANSI text only? Or am I missing something here? I mean, if wcout is not supposed to work properly, does this mean that no console applications can show, for example, text in Japanese?
It's perfectly possible to output Japanese to a console in Windows. You just have to output Shift JIS and have the user set things up properly.
Unfortunately, if I ever catch you doing that, I'll stab you.
It's kind of a conflict. The Windows console is not designed for Unicode, but sending anything that isn't UCS is problematic. IMO, if you need Unicode, don't use the console; that's what GUIs are for. What I've been doing lately is sending, say, error messages to the console in UTF-8, and praying that the console understands it. If it does, great; if it doesn't, at least the user can redirect to a file and read it with a program that does understand UTF-8.
I don't see why it's such a big deal for C++.
Because, technically speaking, Unicode is not portable. The basic character set has to be representable in chars, one character per char. That's why we have wchar_t (which has its own problems as well).
ANSI and code page support is just a legacy thing nowadays.
ANSI isn't legacy! It's the lower 8 bits of UCS! Not only does that cover most of the West's needs (English, Spanish, Portuguese, French, Italian, German, and probably a few others; the first four alone cover basically the entire American continent), but it can also easily be extended to UCS-2 or -4 if that's not enough.
I do agree that anything other than UCS is legacy, though.