Returning 0 from main

I have learned quite a bit doing C++ over the last few months, but there are two things I was curious about now that I've read them in an actual book, because it didn't actually explain why:
1) Why do you need to do
int main() {

}

instead of
void main() {

}

2) Why should you always return 0 inside of a main function?
1) Because the latter is not standard. It's only supported by some compilers. The former compiles everywhere.
2) You shouldn't. You're supposed to return a non-zero value if the program fails. In any case, this only makes sense for some programs. For most programs, it doesn't matter what they return.
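To illustrate that convention, here is a minimal sketch (the file name and the failure check are just hypothetical examples): return 0 when everything went well and a non-zero code when it didn't, so the shell or calling process can inspect the exit status.

#include <fstream>
#include <iostream>

int main()
{
    std::ifstream in("settings.txt"); // hypothetical input file
    if (!in)
    {
        std::cerr << "could not open settings.txt\n";
        return 1; // non-zero: tell the caller something went wrong
    }
    // ... do the actual work here ...
    return 0; // zero: success
}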
Thank you very much for the guidance. It helped clarify a few things for me, thanks again.
main returns an int by default; some compilers would also allow:

const char* main()
{
return "Hello I ave the poorest of programmin styles";
}

but the behavior is undefined, as the main function is supposed to return an int. It's like using a math function that returns a double but passing the result straight into an int without casting; the compiler may allow it, but its behavior is undefined.

Thanks again.
I will keep that in mind as well.
but the behavior is undefined
It's not undefined. "Undefined" means that the standard imposes no requirements on what happens, so the outcome is left to whatever the compiler developers or the underlying hardware happen to do (e.g. what happens when a buffer is overflowed; order of evaluation is a related case, though strictly the standard calls that "unspecified"). The standard explicitly states that the return type of main() must be int.

It's like using a math function that returns a double but passing the result straight into an int without casting; the compiler may allow it, but its behavior is undefined.
That's also not undefined. The standard has very well defined conversion rules for all the basic types. Implicitly converting a double to an int is perfectly legal: the fractional part is simply discarded, as long as the result fits in an int.
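For what it's worth, here's a small sketch of that implicit conversion (the use of std::sqrt is just for illustration): the fractional part is discarded.

#include <cmath>
#include <iostream>

int main()
{
    double root = std::sqrt(10.0);  // roughly 3.1622
    int truncated = root;           // implicit double-to-int conversion, no cast
    std::cout << truncated << '\n'; // prints 3
    return 0;
}

A compiler like GCC will warn about that conversion if you ask it to (e.g. with -Wconversion), which is related to the later question about warnings.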
I thought undefined just meant that the behavior is unknown. In the case of the Visual Studio compiler, it seems to allow any return type for main(), while others (such as GNU) don't...
If, as you suggested, the standard does not leave this detail up to the compiler developers, then why is this allowed by the most commonly used IDE in the world?

Also, why do compilers (such as GNU) throw warnings about converting between types without a cast if it's perfectly fine?
I thought undefined just meant that the behavior is unknown.
Some undefined behaviors, such as dereferencing an invalid pointer, are non-deterministic, but that's an issue with the platform. A platform where dereferencing an invalid pointer always causes Dvorak's 9th symphony to play isn't inconceivable. The behavior is still undefined, but on this hypothetical platform, it's deterministic.
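As a minimal (deliberately broken) sketch of what "dereferencing an invalid pointer" looks like; don't expect any particular outcome if you actually run it:

int main()
{
    int* p = nullptr; // p doesn't point at any object
    *p = 42;          // undefined behavior: the standard imposes no requirements;
                      // on most platforms this crashes, but nothing guarantees it
    return 0;
}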

If, as you suggested, the standard does not leave this detail up to the compiler developers, then why is this allowed by the most commonly used IDE in the world?
That's called a compiler extension. I'm not sure why void main() is allowed. I suppose at some point before the standard came out, some compilers allowed it, and now it's there for backwards compatibility.

Also, why do compilers (such as GNU) throw warnings about converting between types without a cast if it's perfectly fine?
Because some implicit conversions can cause subtle bugs if done recklessly. For example, converting from a signed type to a bigger unsigned type causes sign extension, but it's easy to forget about that and assume that the new bits will always be zeroed.
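Here's a small sketch of that pitfall (the fixed-width types are just for illustration):

#include <cstdint>
#include <iostream>

int main()
{
    std::int32_t small = -1;
    std::uint64_t big = small;  // sign-extended first, so all 64 bits end up set
    std::cout << big << '\n';   // prints 18446744073709551615, not 4294967295
    return 0;
}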
helios and Duoas, very interesting..
Thanks for the feedback. I'm still a little unsure as to the exact meaning of the word; however, I just won't use it in future.
Thanks again, that was pretty helpful. I wasn't going to look back because I thought responses on this were dead, but I just read back through and gained a wealth of knowledge. Thanks everyone.