|1. can we know here how much it is limited size-wise? For example, if the RAM is 4 gigabytes, is the stack 2 GB? Or is there a minimum or maximum value for it?|
There is no standard way of getting or setting the stack size.
On Linux I can get the default stack size by running "ulimit -s" from the terminal. For me it's 8 MB. I believe it's smaller on Windows.
It is possible to increase the stack size, but I haven't had the need to do so. I have only played around with it a little. I know that, at least on Linux, it is possible to set the stack size to "unlimited", but that has consequences, such as running out of RAM within seconds if I accidentally write an infinite recursion.
|2. you mean here I need to check if the user input is valid, that is, it is not negative, or a letter...?|
In a real program you should always do that. Never trust input from the user.
I guess that when dealing with dynamically allocated memory it can be OK to just let the program crash if you run out of memory, because that usually doesn't happen and the consequences are often not serious. But here you're dealing with the stack, which has a very limited size, so the chance of running into problems with a too-large array size is much greater. I don't know whether all implementations guard against stack overflow with VLAs (does anyone know?), but if they don't, that could potentially become a security hole that a hacker might be able to exploit.
|3. I would like to know every single of those, so that I can turn them off. I already did the VLA feature.|
I think -pedantic goes a long way to warn about non-standard features. If you want to turn them into errors rather than warnings you can use -pedantic-errors.
Note that GCC defaults to a "GNU dialect" of C++. GCC 12 defaults to -std=gnu++17. If you want to use standard C++17 you should instead use -std=c++17.
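Putting those flags together, a command line for strict standard C++17 might look like this (the flags are real GCC options; the file names are just placeholders):

```shell
# Strict ISO C++17: no GNU dialect, and non-standard constructs
# such as VLAs become hard errors instead of mere warnings.
g++ -std=c++17 -pedantic-errors -Wall -Wextra main.cpp -o main
```

The same flags work with Clang.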
|4. Do you mean I can do this:|
I think Microsoft's compiler allows it,
but GCC and Clang don't (at least not on Linux). I wonder what compiler allows that.
|5. I cannot seem to build an example here|
union { std::uint32_t value; std::uint8_t bytes[4]; } s; // layout assumed; not shown above
s.value = 0x12345678;
std::cout << std::hex;
for (std::uint8_t byte : s.bytes)
    std::cout << int(byte) << "\n";
This program writes a value (0x12345678) to one member of the union (value) and reads it back through another member (bytes).
For me, on a little-endian machine, this prints
Many compilers guarantee this to work, but according to the C++ standard it is technically undefined behaviour, because you may only read the union member that was most recently written.