Making a char print as a two-digit hex

This is what I thought up to print a char as a two-digit hex. Is there an easier/better way?

#include<iostream>
#include<iomanip>
#include<sstream>
#include<string>
#include<vector>

using std::cout;

inline std::string hex_last_2(const char& c){
    std::stringstream ss;
    ss << std::setw(2) << std::setfill('0') << std::hex << int{c};
    return ss.str().substr(ss.str().size() -2, ss.str().size());
}


int main() {
    
    char c1{'@'};
    char c2{-1};
    char c3{-2};

    std::vector<char> vc = {c1,c2,c3};

    // I wrote this to print the hex version of each character
    for(auto& c : vc) cout << hex_last_2(c) << ' '; 
    cout << '\n';

    // This is what happens if I don't
    for(auto& c : vc) cout << std::hex << int{c} << ' '; 
    cout << '\n';

} 
Consider:

#include<iostream>
#include<iomanip>
#include<vector>

int main()
{
	char c1 {'@'};
	char c2 {-1};
	char c3 {-2};

	const std::vector<char> vc = {c1, c2, c3};

	for (const auto& c : vc)
		std::cout << "0x" << std::setw(2) << std::setfill('0') << std::hex << (int)(c & 0xff) << ' ';

	std::cout << '\n';
}


which shows


0x40 0xff 0xfe



The issue is that on most systems char is signed, so when it is promoted to int the sign bit gets extended - hence all the ff's. ANDing with 0xff clears those extended bits, leaving just the low byte. An alternative is:

 
std::cout << "0x" << std::setw(2) << std::setfill('0') << std::hex << (int)(unsigned char)c << ' ';


which explicitly casts c to an unsigned char before it is promoted to an int.
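
To make the sign extension visible, here is a minimal standalone sketch (assuming a platform where char is signed and int is 32 bits - it isn't part of either program above):

#include <iostream>

int main()
{
	char c { -1 };                              // bit pattern 0xff when char is signed

	std::cout << std::hex;
	std::cout << (int)c << '\n';                // promoted with sign extension: prints ffffffff
	std::cout << (c & 0xff) << '\n';            // mask keeps only the low byte: prints ff
	std::cout << (int)(unsigned char)c << '\n'; // converted first, so no extension: prints ff
}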
#include <iostream>
#include <vector>
#include <iomanip>

int main()
{
   unsigned char c1 { '@' };

   // assigning a signed int (-1 is an int) to an unsigned char is a conversion
   // can't use uniform initialization without casting
   unsigned char c2 = -1; // you should get a warning that there is a conversion mismatch

   unsigned char c3 { static_cast<unsigned char>(-12) };

   std::vector<unsigned char> vc { c1, c2, c3 };

   for (auto& c : vc)
   {
      // +c 'fools' std::cout into treating c as a number, not a character
      // sign isn't changed
      std::cout << "0x" << std::hex << +c << ' ';
   }
   std::cout << '\n';
}
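
If it helps to see what the unary + is doing, here is a tiny sketch (just an illustration, separate from the program above):

#include <iostream>

int main()
{
   unsigned char c { '@' };

   std::cout << c << '\n';               // inserted as a character: prints @
   std::cout << +c << '\n';              // unary + promotes it to int: prints 64
   std::cout << std::hex << +c << '\n';  // same value in hex: prints 40
}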
Thank you both. I'm not sure why I never considered changing char to unsigned char. It works and looks much cleaner.

One additional question for my edification: why does this change a char to an unsigned char?
c & 0xff
Integer literals are of type int, so 0xff is really 0x000000ff (for 32-bit ints). ANDing it with a char (signed or unsigned) first promotes the char to int - sign-extending it if it's negative - and the & 0xff then clears everything above the low byte. So it doesn't actually give you an unsigned char; it gives you an int whose value is what the unsigned char would hold. Since the result is already an int, no further cast or promotion takes place and no sign bit gets extended. I was a bit lax with the explanation in my previous post.

You actually don't need the (int) cast:

 
std::cout << "0x" << std::setw(2) << std::setfill('0') << std::hex << (c & 0xff) << ' ';
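
If you want to verify that, a short check (a sketch, assuming C++17 for std::is_same_v) shows the expression already has type int:

#include <type_traits>

int main()
{
	char c { -1 };

	// the char is promoted to int before the &, and 0xff is an int,
	// so the whole expression has type int - operator<< needs no cast
	static_assert(std::is_same_v<decltype(c & 0xff), int>, "c & 0xff is an int");
}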



Thank you for your time, it was very helpful. I had to do some thinking, but I learned something, or at least re-remembered something I had forgotten about bitwise operators.
Topic archived. No new replies allowed.