How to enter a Unicode code point into a std::string

Hi,
I am starting to program with gtkmm/cairomm and I must pass some text to a function that has the following signature:

void Cairo::Context::show_text ( const std::string & utf8 )

The code points I want to enter are rather high values corresponding to musical symbols, like U+1D161 (the musical sixteenth-note symbol).

My IDE editor (Eclipse under Linux) doesn't accept the "Ctrl+Shift+u 1D161" way of entering the symbol, so I tried to enter the code point with show_text( "\x1D161" ), which I read was the way to enter a character by its code point, but that doesn't work.

Nonetheless this method works fine with rather low code points like \x61 (a): show_text( "\x61" ) displays the letter a.

Moreover, if I enter the symbol in OpenOffice Writer with Ctrl+Shift+u 1D161 and then copy and paste it into my IDE editor to form show_text( "<pasted symbol>" ), it works.

My question is: how do I enter a code point outside the Basic Multilingual Plane (high values) into a UTF-8 std::string?
The safest and most portable way would be to manually convert the code point to UTF-8 and then put each individual byte into the string in "\xXX" form.

For instance, U+1D161 is represented in UTF-8 as F0 9D 85 A1 (I think! double check my work!) So you could do this: "\xF0\x9D\x85\xA1"
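
If you'd rather not work the bytes out by hand, a small helper can do the conversion at run time. This is only a sketch (the function name codepoint_to_utf8 is made up for illustration; it is not part of gtkmm/cairomm), but it follows the standard UTF-8 bit layout:

#include <string>

// Encode a single Unicode code point (up to U+10FFFF) as UTF-8 bytes.
std::string codepoint_to_utf8(unsigned long cp)
{
    std::string out;
    if (cp <= 0x7F) {                  // 1 byte:  0xxxxxxx
        out += static_cast<char>(cp);
    } else if (cp <= 0x7FF) {          // 2 bytes: 110xxxxx 10xxxxxx
        out += static_cast<char>(0xC0 | (cp >> 6));
        out += static_cast<char>(0x80 | (cp & 0x3F));
    } else if (cp <= 0xFFFF) {         // 3 bytes: 1110xxxx 10xxxxxx 10xxxxxx
        out += static_cast<char>(0xE0 | (cp >> 12));
        out += static_cast<char>(0x80 | ((cp >> 6) & 0x3F));
        out += static_cast<char>(0x80 | (cp & 0x3F));
    } else if (cp <= 0x10FFFF) {       // 4 bytes: 11110xxx 10xxxxxx 10xxxxxx 10xxxxxx
        out += static_cast<char>(0xF0 | (cp >> 18));
        out += static_cast<char>(0x80 | ((cp >> 12) & 0x3F));
        out += static_cast<char>(0x80 | ((cp >> 6) & 0x3F));
        out += static_cast<char>(0x80 | (cp & 0x3F));
    }
    return out;
}

With that you could write cr->show_text( codepoint_to_utf8(0x1D161) ), assuming cr is your Cairo::Context.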

Or, if your compiler accepts UTF-8 source and your file is saved in UTF-8 encoding, you can just drop the character directly into your code (which is probably what is happening with the OpenOffice copy-and-paste). But note that not all compilers like that, so doing that might cause portability issues.
Thank you for answering. Your encoding is correct, but unfortunately it doesn't work: I get a C letter.
Sorry, I made a mistake: it gives an empty box.
I put the UTF-8 encoded bytes into a char[ ], then initialized the string with the array, and it worked fine. That's rather tedious, but it does the job.
And I found this: http://www.ltg.ed.ac.uk/~richard/utf-8.html. With it, it becomes less tedious.
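
For anyone finding this later, a minimal sketch of the char[ ] approach described above (the variable names are just illustrative, <string> is assumed to be included, and cr stands for whatever Cairo::Context you are drawing with):

// UTF-8 bytes for U+1D161, written out explicitly, with a terminating NUL.
const char sixteenth_note[] = { '\xF0', '\x9D', '\x85', '\xA1', '\0' };
std::string utf8(sixteenth_note);
cr->show_text(utf8);

Note that these are the same bytes as in the "\xF0\x9D\x85\xA1" literal suggested earlier; if you still see an empty box, the font in use may simply lack a glyph for that symbol.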