Side one basically argues that the "!!" idiom is something any good programmer should understand and is common in programming.
Side two basically argues that the "!!" idiom is just a cheap trick and should be explicitly made clear by using the latter code example above.
Which side is ultimately correct? I wouldn't mind using this idiom or seeing it once in a while, but I also agree that making the code clearer is better in the long run. However, the "!!" idiom doesn't seem that hard to understand and might take a programmer at most two seconds to parse.
I am with the second side. Implicit bool→int conversion is a C remnant and I would like to see it removed (not that it ever will be).
I must note that nonzero = integer; is equivalent to nonzero = !!integer; when nonzero is a bool, so I do not understand its use: both examples are similarly obscure, but the first is easier to type. And semantically, !!x is just x, by double negation.
Also, in the second example you know that something is compared with zero.
In the first you cannot be sure what happens: the logical negation operator is often overloaded to test the validity of objects.
Interesting points. As to your second paragraph, this is what one commenter said under the OP's question:
Martin V Lowis wrote:
After the double negation, the value is guaranteed to be 0 or 1; the original value might be any int value.
By Boolean negation, negating the value false (0) gives its complement, true (1)? I think once a nonzero integer has been negated, the information about its original value is already lost anyway.
I don't understand what happens "under the hood" well enough to know if this is true or not.
Overloaded versions of the logical not operator are definitely something that would destroy the interpretation of "!!". Thanks for the example situation.
When we are assigning to a bool we do not need the value to be 0 or 1.
Assigning 0 to a bool gives false; double-negated 0 is 0, which assigned to a bool also gives false.
Assigning nonzero to a bool gives true; double-negated nonzero is 1, which assigned to a bool also gives true.
So !! changes nothing. And for built-in types it probably will not affect code generation either (if you have a good compiler; otherwise it may generate slower code).
It might be useful in C, where int was used as the boolean type and some functions expected exactly 0 or 1 and broke if anything else was passed. But not in C++ with its own bool type.
First of all, C has no type bool. Even C's _Bool is not really a separate boolean type; it is an unsigned integer type. And _Bool was only introduced in C99; there is no such type in C89.
Sometimes it is important to have exactly the values 1 and 0 for an expression with logical meaning, so !! is used to generate such values from any expression. Consider the following code:
enum bool { false, true };
You cannot write, for example,
number & 0xF == true
because number & 0xF can be equal to any nonzero value from 1 to 15, not just 1; you need !! to collapse it to exactly 0 or 1 first.
Am I the only one who abhors comparisons to true and false?
x == true ⇔ x
x == false ⇔ !x
What the eff is up with comparing to Boolean constants?
As a matter of personal taste, I also dislike explicit comparisons to zero, so I always write things like if (!!string && !strcmp(string, "literal")). (I would forgo the !! if some compilers didn't generate warnings.)
I wasn't aware that VC++ 9.0 and above was considered prehistoric. I believe I've also seen it on GCC. I can never trigger it when I want to, though; dammit.
As a former assembly programmer I confidently use statements like:
int x = 3;
while (x) {
    // do stuff
    x--;
}
because I know what happens inside the CPU. A CPU doesn't know a 'bool' datatype or 'true' or 'false', and for a compare with zero there's a special flag in the CPU, so it doesn't even have to do a comparison. So technically while(x != 0) is slower, but the optimizer will drop that anyway.
BTW: Why do you guys think all the comparison logic in C is like "do until FALSE" instead of "do until TRUE"? Because it's very easy and fast for the CPU to do a X==0 check.
Both if (x == 0) and if (x != 0) would most likely compile to:
test eax,eax
jz label
i.e. the compiler would change if (x != 0) { doSomething; } into if (x == 0) {} else { doSomething; }.
Or possibly it will use jnz. Neither way is faster than the other.
Compilers are far better at low-level optimization than humans. If you have a decent compiler you will rarely need to do instruction-level optimizations; you should concentrate on algorithm-level ones.
Edit: I have nothing against implicit pointer-to-bool conversion. It is a well-known idiom:
if (p)
    doSomethingWithP();
and I see it as an overloaded "is_valid" operator for complex types.
And in some rare cases for integer types in simple conditions like while(x). (But not while(x && /*...*/); use while(x != 0 && /*...*/) there!)
The compiler will optimize those simple operations anyway, so there is little to no increase in speed and a great loss of readability.
The thing is, you cannot be sure what x means in while(x). Is it an int, looping while it is nonzero? Is it some class with a bool conversion that signals validity? Or does it use some other implicit conversion and language quirks?
You cannot be sure until you find the definition of x or an example of how it is manipulated. So there is a loss of readability in the code.
you cannot be sure what x means in while(x)
My question was about why you allow while (x) but not while(x && ...)? Is the latter somehow less explicit about what x is than the former?
The thing is, you cannot be sure what x means in while(x). Is it an int, looping while it is nonzero? Is it some class with a bool conversion that signals validity? Or does it use some other implicit conversion and language quirks?
This seems rather flaky. If you have no idea at all what something is, how can you assume that any operation you care to think of will be valid, let alone that (bool)x and x != 0 will be equivalent?
Saying this is like saying "I prefer to cycle to work because someone could steal my Lamborghini", when in fact you're a lettuce.
I don't get the analogy. But anyway, simply looking at while(x) doesn't tell you much about what you're doing; you would have to look over other parts of the code to find out what x is and what it means for it to be converted to bool.