One of my students handed me this snip of code and asked why it core dumps after entering 8 numbers. I know the obvious answer is to use new to create the array at run time (roughly what the sketch after the snippet shows). The question I have is: why did it allow any numbers to be assigned to the array kb_array, since it is defined to have 0 elements?
int n;
int sum;
int kb_entries;
int kb_num;
n = 0;
float kb_array[n];
cout << "How many numbers to be inputted from keyboard?" << '\n';
cin >> kb_num;
kb_entries = kb_num;
cout << "Begin with 1st number" << '\n';
while (n < kb_entries)
{
    cin >> kb_num;
    kb_array[n] = kb_num;
    cout << kb_array[n] << '\n';
    n = n + 1;
    if (n < kb_entries)
    {
        cout << "Enter another number from the keyboard" << '\n';
    }
}
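(For reference, the run-time allocation I mean would look something like this; it is only a sketch, and the new[] needs a matching delete[]:)

#include <iostream>
using namespace std;

int main()
{
    int kb_num = 0;
    cout << "How many numbers to be inputted from keyboard?" << '\n';
    cin >> kb_num;

    float* kb_array = new float[kb_num];   // size known only at run time

    for (int n = 0; n < kb_num; n = n + 1)
    {
        cin >> kb_array[n];
        cout << kb_array[n] << '\n';
    }

    delete[] kb_array;                      // release what new[] allocated
    return 0;
}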
In standard C++ this shouldn't even compile, since n is not a constant expression. In addition, the compiler shouldn't accept an array of zero elements; an array of zero elements doesn't even make sense.
GCC has a language extension that allows creating variable-sized stack arrays.
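For example, g++ accepts something like the following without complaint; compiling with -pedantic produces a warning along the lines of "ISO C++ forbids variable length array" (the exact wording depends on the version). A small illustration, not taken from the original post:

#include <iostream>
using namespace std;

int main()
{
    int n = 0;
    cin >> n;
    float a[n];    // variable-length array: a GCC extension, not standard C++
    cout << "created an array of " << n << " floats on the stack" << '\n';
    return 0;
}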
"why did it allow any numbers to be assigned to the array kb_array, since it is defined to have 0 elements?"
While it may seem obvious to you, the compiler can't know that n has the value 0 at the point where float kb_array[n]; is declared. It merely generates code that will create an array of that size on the stack.
Suppose this code were part of a threaded application:
Between the assignment n = 0; and the declaration of kb_array, some other thread could have changed the value of n. Had the compiler assumed that n didn't change simply because this thread of execution didn't change it, the semantics would have been broken.
You could argue that nothing of the sort is happening in this particular code, but the fact is that it's hard to build enough cleverness into a compiler to recognize when such assumptions are safe to make.
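One way to convince yourself of that: with the g++ extension, the array's size really is taken from n when the declaration is executed, so sizeof on the array reports the run-time size. A minimal illustration (not part of the original snippet; with n set to 0 it should print 0):

#include <iostream>
using namespace std;

int main()
{
    int n = 0;
    float kb_array[n];                 // size read from n when this line runs
    cout << sizeof kb_array << '\n';   // should print 0: no bytes were reserved
    return 0;
}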
I don't see why the compiler should choke on an array of length 0 (apart from the fact that it is completely useless).
You asked for it, and you got it.
C++, like good old C, has no array bounds checking.
An array is essentially just the address of the beginning of a chunk of memory, and nothing stops you from
writing past the end of that chunk, stomping on whatever essential system stuff happens to live there.
How soon you get a segfault depends on how critical the stuff you're stomping on is.
You just got "lucky": the 8-number behaviour is not reproducible.
I agree on all of the above. To answer your question more directly: since C-style arrays have no bounds checking, kb_array[n] will blindly take whatever value it is given, even when the memory that expression points to (in pointer math: kb_array + n) is not safely allocated for data use. At that point it is a matter of luck: have you corrupted the stack? Maybe, maybe not. You may get a segfault, or you may not get one at all.
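Spelled out, the indexing is nothing more than pointer arithmetic; a[i] and *(a + i) are the same expression. A small self-contained illustration (the variable names here are made up for the example):

#include <iostream>
using namespace std;

int main()
{
    float a[4] = {0, 0, 0, 0};
    int i = 2;
    a[i] = 1.5f;        // array-index syntax
    *(a + i) = 1.5f;    // identical effect, written as pointer arithmetic
    cout << a[i] << '\n';
    // With the zero-sized kb_array from the question, the same arithmetic
    // lands outside any allocated storage, which is why the write is
    // undefined behaviour.
    return 0;
}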
Now, personal opinion: dynamically sized stack arrays are evil. Stick to the standard, which gives you fixed-size arrays on the stack and dynamically sized arrays on the heap.
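For completeness, here is roughly what the standard-conforming version of the original loop looks like with the data on the heap; std::vector handles the allocation and the cleanup. Just a sketch, reusing the variable names from the question:

#include <iostream>
#include <vector>
using namespace std;

int main()
{
    int kb_entries = 0;
    cout << "How many numbers to be inputted from keyboard?" << '\n';
    cin >> kb_entries;

    vector<float> kb_array(kb_entries);   // dynamically sized, lives on the heap

    cout << "Begin with 1st number" << '\n';
    for (int n = 0; n < kb_entries; n = n + 1)
    {
        cin >> kb_array[n];               // indices stay in range because the
        cout << kb_array[n] << '\n';      // vector was sized from kb_entries
        if (n + 1 < kb_entries)
        {
            cout << "Enter another number from the keyboard" << '\n';
        }
    }
    return 0;
}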