```cpp
#include <iostream>
using namespace std;

void foo( int y[10] ) {
    // y has decayed to int*, so sizeof reports the pointer size (8 on 64-bit)
    cerr << "PARAMETER = " << sizeof( y ) << " bytes" << endl;
}

int main() {
    int x[10];
    // x is a real array here, so sizeof reports 10 * sizeof(int) = 40
    cerr << "AUTOMATIC VARIABLE = " << sizeof( x ) << " bytes" << endl;
    foo( x );
}
```
```
AUTOMATIC VARIABLE = 40 bytes
PARAMETER = 8 bytes
```
This result would surprise most people, as the OP suggests. Note that if I change y inside foo, I also change x, because no new memory was allocated. OTOH, if I had used vector<int> y as a parameter, the vector would be copy-constructed into new memory, so changing y would have no effect on x; see the sketch below.
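
Here is a minimal sketch of both behaviors (the names scribble and scribbleCopy are mine, purely for illustration):

```cpp
#include <iostream>
#include <vector>
using namespace std;

// The array parameter decays to int*, so writes land in the caller's memory.
void scribble( int y[10] ) { y[0] = 99; }

// A vector passed by value is copy-constructed; writes touch only the copy.
void scribbleCopy( vector<int> y ) { y[0] = 99; }

int main() {
    int x[10] = {};
    scribble( x );
    cerr << "x[0] after array version  = " << x[0] << endl; // 99

    vector<int> v( 10, 0 );
    scribbleCopy( v );
    cerr << "v[0] after vector version = " << v[0] << endl; // still 0
}
```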
I wonder if it is an artifact of the early C days, when you could never push an array onto the stack (as a parameter) because it would overflow. In that context, it is obvious that you would always pass a pointer and never an array.
These days, C++ couldn't care less if you want to pass a large object on the stack as a parameter - it will happily copy-construct it for you. BTW, there is no fundamental reason why the code has to behave the way it does above; it was simply standardized that way. After all, the compiler clearly knows the sizeof() of that chunk of memory at that address!
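
For instance, wrapping the array in a struct (Block is a hypothetical name, not from the OP's code) is enough to make the compiler copy all 40 bytes and keep sizeof honest:

```cpp
#include <iostream>
using namespace std;

struct Block { int data[10]; };  // hypothetical wrapper, just for illustration

// Passed by value: the whole struct is copied onto the stack.
void foo( Block b ) {
    cerr << "PARAMETER = " << sizeof( b ) << " bytes" << endl;  // 40, not 8
    b.data[0] = 99;  // touches only the copy
}

int main() {
    Block x = {};
    foo( x );
    cerr << "x.data[0] = " << x.data[0] << endl;  // still 0
}
```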
The bottom line? According to sizeof() and assignment behavior, x is an array (it has its own contiguous memory) but y is a pointer (it refers to memory allocated elsewhere).
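
If you really want the parameter to stay an array, C++ does let you take a reference to an array of a fixed size; a sketch, assuming the same 4-byte int as above:

```cpp
#include <iostream>
using namespace std;

// A reference to an array of exactly 10 ints: no decay, sizeof sees the array.
void foo( int (&y)[10] ) {
    cerr << "PARAMETER = " << sizeof( y ) << " bytes" << endl;  // 40
}

int main() {
    int x[10];
    foo( x );
}
```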