Would an expression involving a const (or a defined variable) always be evaluated?

Hi all,

I am new to C++, please forgive me if I have used inaccurate terminology in the question.

Let me post some simple code to illustrate my question better.

-----

#define a 0.0

#include <iostream>

int main() {
    double b;

    switch (a == 0.0) {
    case 0: {
        b = 0.01 / a;
        break;
    }
    case 1: {
        b = 999;
        break;
    }
    }

    return 0;
}

-----
The intention is that if a == 0, b will be equal to 999, and if a != 0, b will be equal to 0.01/a.

Since I have defined a as 0.0, ideally only the expression b = 999 would be evaluated/compiled. But it turned out the compiler (g++) said it was invalid to divide 0.01 by zero, which means the expression 0.01/a was still being evaluated EVEN THOUGH I have used the switch.

I tried using an if-else statement, and also defining const double a = 0.0 inside main, but the problem is still there.

I guess I am missing some fundamental concepts in C++.

Thank you in advance for the help.

Yes, the compiler says that, BUT it's just a warning and the program can still run. I tried it, and as long as you don't include <iostream> it works. <iostream> includes some other header that doesn't allow something in there, I don't know what.
Hi vilml, thanks for the reply.

Does it mean an expression involving a const (or a defined variable) will always be evaluated at compile time?

I think <iostream> is necessary in general?
It isn't necessary; you can do console input and output with scanf and printf.
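Something along these lines, for example (just a rough sketch using <cstdio>, nothing you have to take verbatim):

#include <cstdio>

int main() {
    double x = 0.0;
    std::printf("enter a number: ");
    if (std::scanf("%lf", &x) == 1)          // read one double from stdin
        std::printf("you entered %f\n", x);  // print it back out
    return 0;
}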
Why not just do...

if (a != 0.0)
    b = 0.01 / a;
else
    b = 999;
//....
Hi clanmjc,

This indeed gives the same error in my compiler, as long as "a" is a const or a defined macro.
I guess what I'm not understanding is why you are declaring 'a' as a const or a #define and then trying to switch on it as if it were going to change?
Because I would like to make sure "a" won't be changed during the execution of the whole program.
More to the point - why do you want to divide anything by zero...?
The source code that you presented to the compiler (after the preprocessor pass) was b = 0.01/0.0;.

The right-hand side of it is the constant expression 0.01/0.0. Compilers can, and probably all of them do, evaluate constant expressions at compile time in order to create more efficient machine code. The code that is actually compiled assigns the special value INFINITY to b.
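A quick sketch of that effect, assuming the usual IEEE-754 doubles (the printf line is just one way to observe the result):

#include <cstdio>
#include <cmath>   // std::isinf

int main() {
    double b = 0.01 / 0.0;                           // constant expression, typically folded at compile time
    std::printf("%f %d\n", b, (int)std::isinf(b));   // expected output on such platforms: "inf 1"
    return 0;
}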

In this case, this particular optimization may be turned off if fenv_access is enabled, but almost no compilers support that.

PS: using #define for such purposes is bad practice
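For example, a sketch of the same program with a named constant instead of the macro (includes come first, so the preprocessor can't rewrite anything inside the headers; whether your particular compiler still warns about the never-taken division branch is a separate question):

#include <iostream>

const double a = 0.0;   // or constexpr double a = 0.0; in C++11 and later

int main() {
    double b;

    if (a != 0.0)
        b = 0.01 / a;
    else
        b = 999;

    std::cout << "b = " << b << '\n';   // prints 999, since a == 0.0 takes the else branch
    return 0;
}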
Hi Cubbi,

Thanks for the reply. It clarifies things a lot.

Topic archived. No new replies allowed.