I am not an expert by any means (just so you and anyone replying afterwards know that I acknowledge there are likely better options than what I'm suggesting, and that my terminology may be way off).
The simplest way I can think of is to create two integer variables and assign the characters to them (they should get converted to their ASCII values), then use an if statement to see which is higher and display the lower character first (or just verify it, if you don't want it displayed).
I tested this on an old compiler; I don't know whether assigning a char to an int still works the same way on newer compilers.
I think Duoas had the right idea, although he phrased it in a strange manner. You just use the less-than/greater-than operators ('<' and '>') on the characters directly; a character appearing later in the alphabet has a greater value. My program wasn't working and I wasn't sure I was doing this part correctly, so I wanted to double-check my thinking with the people on this board. I found the error in my program, and it turned out to be unrelated to this topic.