I was trying to send the size of a buffer over to a server, ran into difficulty, and settled on a crappy solution of hardcoding the size as '\x10' (lol). I didn't try too long before giving up. I figured mySS << std::hex << sizeof(myBuf); would have given the value of sizeof myBuf in hexadecimal but alas, it doesn't. It converts it to characters and inserts them as, well... characters.
Problem:
I need a solution to turn integers into hexadecimal cleanly via C++.
I think kbw is correct. It sounds like you just want to send a length. Does the protocol you're using want the size of the buffer as a string representation of a hexadecimal number?
mySS << std::hex << number
This will output the number as a STRING of hex digits.
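To make that concrete (a minimal sketch):

std::stringstream mySS;
mySS << std::hex << 16;   // formats the number as TEXT
// mySS.str() is now "10": the two characters '1' and '0' (bytes 0x31 0x30),
// not the single raw byte 0x10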
hardcoding the size as '\x10'
Are you doing this because the size field is only one byte wide? If that is the case then you could try (unsigned char)(number & 0xff)
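That is, something along these lines (a sketch, assuming the length fits in one byte and reusing the myBuf name from the question):

std::stringstream ss;
size_t len = sizeof(myBuf);           // e.g. 16
ss << (unsigned char)(len & 0xff);    // inserts the single raw byte 0x10, not the text "16"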
I need a solution to turn integers into hexadecimal
0x10 == 16 == 0001_0000 - it's all the same... a number is a number for the computer... the representation is just something the programmer uses... to the machine it's all 1's and 0's.
Perhaps providing a larger snippet of the code you are using and maybe some details on what server / protocol / program you are trying to communicate with would be helpful?
When you input an integer to a stringstream such as mySS << myInt; it places a string representation of the number (as ASCII characters). What I really want is the true value of the integer in hex. So for instance, sizeof(myBuf) is equal to 16. I'd like to send the hex 0x10. This code is rather sloppy but bear with me:
std::stringstream s(std::stringstream::in | std::stringstream::out | std::stringstream::binary);
s << '\x02'; // Header packet.
if (pClient->Send((const uint8*)s.str().c_str(), s.str().size()) == -1)
    return false;
std::stringstream ss(std::stringstream::in | std::stringstream::out | std::stringstream::binary);
char randStr[16];
gen_random(randStr, sizeof(randStr)); // Generate hash.
std::cout << randStr << std::endl;
// I want something done here so I don't have to hard-code this size, since in the
// protocol randStr isn't necessarily 16 bytes but could be various sizes.
ss << '\x00';
ss << '\x10'; // This is the sizeof randStr in hex.
ss << randStr; // The authentication token.
if (pClient->Send((const uint8*)ss.str().c_str(), ss.str().size()) != -1)
    return true;
else
    return false;
And actually, the size of the field is two bytes (16 bits, i.e. a short) but I keep getting strange results from alternate solutions. It's not actually required to change the size of the generated hash, so I don't plan to, but I'd still like to know how to accomplish what I couldn't figure out.
Better reformed question:
How do you send integer values without sending their string representation? For instance, I want to be able to send a short with the value 16, look at my packet analyzer, and see "00 10".
So instead of converting the short to "16", I'd like it to stay as "0x00 0x10"
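In other words, I'm picturing something like this (just a sketch of the intent, reusing randStr from my code above):

std::stringstream ss;
unsigned short len = sizeof(randStr);   // e.g. 16 == 0x0010
ss.put(char((len >> 8) & 0xff));        // high byte: 0x00
ss.put(char(len & 0xff));               // low byte:  0x10
// the analyzer now shows "00 10" instead of the characters '1' '6'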
Hex is only meaningful to strings. It is simply one way to represent a number as a string. Raw numbers are always in binary.
What you can do is take the address of your number and cast that to a char*
int num = 0x1A30;
const uint8* buf = reinterpret_cast<const uint8*>(&num); // view the int's bytes directly
pClient->Send(buf, sizeof(num));                         // send all sizeof(int) raw bytes
The only fly in this ointment is sending the number in the correct byte order. On Linux you can use the system functions htons() and htonl(), but I don't know what you would use on Windows.
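For reference, Winsock on Windows exposes the same htons()/htonl() (declared in winsock2.h). A sketch of the reordering step, reusing the thread's pClient->Send and uint8 names:

#include <arpa/inet.h>  // htons() on POSIX; include <winsock2.h> on Windows instead

unsigned short len = 0x0010;       // 16, the length to send
unsigned short wire = htons(len);  // host order -> network (big-endian) order
pClient->Send(reinterpret_cast<const uint8*>(&wire), sizeof(wire)); // bytes 00 10 on the wire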
Say for instance, I did this: outbuf.append(reinterpret_cast<const char*>(&hash_size));
This will append what I want, the true value of hash_size, to the packet or buffer. But when you view it in a packet analyzer, you see 0x00... one byte... when the protocol requires two bytes, which is where I get frustrated. I want to achieve 0x00 0x10, which says 16 within two bytes.
I'm not sure I understand what you're saying, so I'll spell out what I understand.
1. You have a 4-byte integer whose bytes are 00 00 00 10.
2. You re-interpret that sequence as a null-terminated string, i.e. an empty string.
3. You write that empty string down the network and see only the terminating null, 00.
4. But what you want to see is the four-byte integer value 00 00 00 10.
Does that match what you've done?
If that's the case, follow helios' or kooth's advice above.
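Incidentally, if outbuf is a std::string, note that the single-argument append treats its argument as a C string and stops at the first zero byte; the two-argument (pointer, count) overload copies the exact number of bytes. A sketch, reusing your outbuf and hash_size names:

unsigned short hash_size = htons(16);  // two bytes in big-endian order: 00 10
// append(ptr) would stop at the first 0x00 byte;
// append(ptr, count) copies exactly sizeof(hash_size) bytes
outbuf.append(reinterpret_cast<const char*>(&hash_size), sizeof(hash_size));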
I wish I wasn't completely retarded in regard to this stuff. Beej's guide only explains how to use the actual API, not how to manipulate packets, common practices, etc. Let me give a hopefully more useful picture of my situation:
char buf[64]; // What size is my buf supposed to be, normally? Should I make it huge and just reuse it as much as possible?
while (true) // Let's assume I've already called socket, bind, listen, accept, etc.
{
    recv(clientfd, buf, sizeof(buf), 0); // I do error checking normally... for example's sake.
    switch (buf[0]) // buf[0] normally holds the packet header to determine packet type
    {
        case '\x01':
        {
            // Now, I assume this packet holds data. It should contain a string, prefixed with two bytes determining its size.
            // My question is how do I *properly* grab the prefix and string and turn them into data types?
        }
    }
}
I've tried a few methods but I keep getting called "sloppy", which absolutely pisses me off. For instance, how do I turn two different bytes (buf[1] and buf[2]) into a short data type?
So close... I know I can already do this but my biggest confusion is how to combine bytes. A short in this protocol is 2 bytes so the short would be contained in buf[1] and buf[2]. And how to do the reverse, taking a data type, splitting it into two bytes, and sending it.
kbw's solution produces undefined behavior. It assumes sizeof(short)==2 and that the endianness of the input is the same as the system's. A better solution would be (typedef unsigned char uchar;) (uchar(buf[1])<<8)|uchar(buf[2]) if the input is big endian, or uchar(buf[1])|(uchar(buf[2])<<8) if it's little endian. Note that if the type of buf[0] is char, the explicit casts to unsigned char are not optional. Doing bitwise arithmetic, particularly shifts, using signed types is a recipe for disaster. Alternatively, you could declare buf as unsigned char.
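For example, wrapping that in a pair of helpers (a sketch; the read_u16/write_u16 names are made up, and big-endian wire order is assumed):

typedef unsigned char uchar;

// combine two big-endian wire bytes into a short, e.g. read_u16(&buf[1])
unsigned short read_u16(const char* p)
{
    return (unsigned short)((uchar(p[0]) << 8) | uchar(p[1]));
}

// the reverse: split a short into two big-endian bytes for sending
void write_u16(char* p, unsigned short v)
{
    p[0] = char((v >> 8) & 0xff);
    p[1] = char(v & 0xff);
}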