Currently I'm trying to make a program that outputs strings to a binary file, but because I'm using the <string> header, the strings are (I've been led to assume) all different sizes, since they are dynamically allocated.
What I wanted to do is have an int value before each entry to specify the size (in bytes) of the next string object.
So this leaves me wondering: how do I get the size in bytes of a string? I've been told sizeof() doesn't work, and it didn't when I tried it. Some people said that string.size() returns the "size" in bytes, but it wasn't clear whether that means the number of bytes of character data in the string or the size of the whole string object.
I may have a few things wrong here, but any help would be greatly appreciated! :)
The size() method returns the number of elements contained in the string, without counting any terminating null. If you are using std::string, each element is 1 byte; if you are using std::wstring, each element is sizeof(wchar_t) bytes (2 on Windows, typically 4 on Linux). A more generic approach is to multiply size() by sizeof(std::string::value_type) or sizeof(std::wstring::value_type). Personally, I program for Windows and I am used to defining tstring like this:
typedef std::basic_string<TCHAR> tstring;
This way I have a string type that automatically switches between ANSI and Unicode depending on the project settings.
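
For example, a minimal sketch of computing the byte count that way (size() multiplied by the element size); the variable names here are just for illustration:

#include <cstddef>
#include <iostream>
#include <string>

int main() {
    std::string  narrow = "hello";
    std::wstring wide   = L"hello";

    // Number of bytes of character data (excluding any terminating null).
    std::size_t narrowBytes = narrow.size() * sizeof(std::string::value_type);
    std::size_t wideBytes   = wide.size()   * sizeof(std::wstring::value_type);

    std::cout << "narrow: " << narrowBytes << " bytes\n"; // 5
    std::cout << "wide:   " << wideBytes   << " bytes\n"; // 10 or 20, depending on sizeof(wchar_t)
    return 0;
}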
NOTE: I did not verify that the element type was in fact std::string::value_type. Check that out first.
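
To tie this back to the question, here is a rough sketch (not production code) of writing and reading a string with an int byte-count prefix, as you described. The helper names writeString/readString are just illustrative, and this assumes the file is written and read on the same platform (same int size and endianness), with the streams opened in std::ios::binary mode:

#include <fstream>
#include <string>

// Write an int byte count, then the raw character data.
void writeString(std::ofstream& out, const std::string& s) {
    int byteCount = static_cast<int>(s.size() * sizeof(std::string::value_type));
    out.write(reinterpret_cast<const char*>(&byteCount), sizeof(byteCount));
    out.write(s.data(), byteCount);
}

// Read the byte count, then read exactly that many bytes back into a string.
std::string readString(std::ifstream& in) {
    int byteCount = 0;
    in.read(reinterpret_cast<char*>(&byteCount), sizeof(byteCount));
    std::string s(byteCount, '\0');
    in.read(&s[0], byteCount);
    return s;
}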