So, according to what I've been reading, a reference is really just a high-level convenience and doesn't even get a place in memory (which is why we can't have arrays of references). That being said, why not just have a "reference" type, per se, and allow you to assign it to one thing? Is the data type just there for readability?
"I've been reading a reference is really just a high level convenience, and doesn't even get a place in memory"
They do and they don't; the choice is down to the implementation. Normally, however, references are a pointer in disguise, so they do in fact occupy memory.
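For example (this is implementation-typical behaviour, not something the standard mandates), a reference stored inside a class usually ends up taking pointer-sized storage:

#include <iostream>

struct Holder {
    int& ref;   // most implementations keep a hidden pointer here
};

int main() {
    int x = 42;
    Holder h{x};
    // On most implementations this prints the size of a pointer (e.g. 8 on a
    // 64-bit target), showing the reference does occupy storage in the object.
    std::cout << sizeof(Holder) << '\n';
}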
ResidentBiscuit wrote:
"(why we cant have arrays of references)"
That and because references must be initialised, and since there's no guarantee that an array is initialised, there's no guarantee that all the references will be initialised.
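A quick illustration of how those two rules collide (the commented-out lines won't compile):

int x = 0;
int& good = x;   // fine: bound at the point of declaration
//int& bad;      // error: a reference must be initialised
//int& arr[3];   // error: arrays of references aren't allowed at all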
ResidentBiscuit wrote:
"That being said, why not just have a "reference" type, per se, and allow you to assign it to one thing."
That would be the exact same thing, and wouldn't offer anything in terms of benefits over the basic reference type. Besides, that would be extra typing.
They do and they don't; the choice is down to the implementation
I had no idea this was implementation defined. I just thought the compiler pretty much stripped the reference away and replaced it with the same location as the object referred to.
But yea, I wasn't really arguing that it would be a better idea to drop the type of references, just curious why it was left in there when it really doesn't make a difference.
doesn't even get a place in memory (why we cant have arrays of references)
That's only partly the reason. Much more important is the fact that references are guaranteed to be valid. That guarantee can't be made if you allow arrays of references, since creating an array and initializing it are two distinct operations.
Plus, in most situations references do exist in memory.
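(Aside: if you genuinely want something array-like that holds references, std::reference_wrapper from <functional> is the usual C++11 workaround; roughly:)

#include <functional>

void demo() {
    int a = 1, b = 2;
    // An array of std::reference_wrapper<int> behaves much like an "array of references".
    std::reference_wrapper<int> refs[] = { a, b };
    refs[0].get() = 10;   // writes through to a
}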
That being said, why not just have a "reference" type, per se, and allow you to assign it to one thing.
Such a thing would have the drawback of pointers (possibility of invalidity) without their advantage (reassignability).
Yea, I realize this. But Helios seemed to be saying it would be different if the data type was dropped and replaced with a reference type. I was trying to see how it would be different than it is now.
How would they even become invalid in any way that isn't already possible now?
Maybe I misunderstood what you meant. You're saying you'd like to be able to do this, correct?
int a=0;
ref r;
f(r);
r=a;
At the call to f, r is invalid. It would have to somehow check that its parameter is valid. This is no different than with pointers, except that r can be assigned only once.
If you want to force assignment at creation, that's exactly what C++ references are.
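In other words, with actual C++ references the binding happens exactly once, at the declaration:

int a = 0, b = 1;
int& r = a;   // binding: r refers to a, and must be initialised right here
r = b;        // NOT a rebinding; this assigns the value of b to a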
The book I read this in was written in '95, so I guess this has changed since then.
Compilers in 1995 were even less clever, so I doubt it. What you got was the theoretical version, which is alright, but books on languages like C and C++ should tell you about the practical possibilities as well.
Well, if a reference was implemented with a pointer, the underlying pointer would be invalidated in the same way as a reference - you would have to dereference the pointer to find out, though.
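For example (just a sketch), a reference can end up dangling in exactly the same situations a pointer would:

int& broken() {
    int local = 5;
    return local;   // reference to a local; the storage is gone once broken() returns
}
// Using the result of broken() afterwards is undefined behaviour, exactly as it would
// be with a pointer to that dead local; the reference itself gives you no way to check.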
I think you did misunderstand me. I'm not saying change the behavior of references at all, just asking why they need a type.
In this hypothetical world where there is a reference type, they would still need to be initialized at declaration.
but books on languages like C and C++ should tell you about the practical possibilities as well.
Being a book on memory management, I assumed the author had it right when it came to this :( I know 4 or 8 bytes isn't a huge deal, but if the main topic of a book is memory management, you'd think he would know how this actually happens.
void f(ref a,ref b){
a=b;
}
//a is an alias for whatever b is an alias to.
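For comparison, here's the same shape of code with real C++ references; the assignment goes through to the referred-to objects instead of re-aiming anything:

void f(int& a, int& b) {
    a = b;   // copies the value of b's object into a's object;
             // a still refers to the same object it did before (no rebinding)
}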
For the same reason pointers need a type
But when you dereference a pointer, you're getting access to a variable amount of memory. You don't dereference references (ha, I hope I'm not the only one who found that amusing), and the compiler knows exactly how much memory the reference needs. But then again, this assumption is based on thinking the compiler doesn't just make a hidden pointer and call it a reference.
What you're asking for is polymorphic code, which is checked for validity at the point of call and does various other things, and is already possible to do with templates. However, templates have special requirements of their own, such as that the implementation has to be visible at the point of call. You can't compile a template function and the function that calls it independently.
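Roughly what the template version looks like (just a sketch; the definition has to be visible wherever it's called, e.g. in a header):

template <typename T>
T& assign(T& a, const T& b) {
    a = b;          // works for any T that supports copy assignment
    return a;
}

void demo() {
    int x = 0, y = 7;
    assign(x, y);   // instantiated here for T = int; x is now 7
}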
You don't dereference references
Yes, you do. Implicitly, anyway:
int f(int &r){
r=10; //Dereferencing.
return r; //Dereferencing.
}
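Which is exactly what the pointer spelling of the same function makes explicit:

int f(int* p) {
    *p = 10;      // explicit dereference
    return *p;    // explicit dereference
}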
Well, this would be invalid. But type-checking is already in place. I agree this sounds a whole lot like templates, which I didn't realize at all when I made this thread. Now I guess I'll do some research as to how those work exactly. Anyway, back to this.
Yes, you do. Implicitly, anyway:
This boils back down to references being just another name for the same object. Kind of like a typedef? But I guess this is not always the case. So what's the benefit to the compiler making a hidden pointer for my reference?
So what's the benefit to the compiler making a hidden pointer for my reference?
Guarantee of validity.
With a pointer, you can do this:
void f(int *p){
if (!p)
return;
//...
}
But if p is non-null yet still invalid (say, it points to a deleted portion of the heap), the function will most likely cause undefined behavior. This would be completely impossible if p was a reference. References are useful when you don't want to pass something by value but also always want the parameter to be there.
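So the reference version of the same function simply has no check to make; the caller is obliged to hand over a real object:

void f(int& r) {
    // No null check possible or necessary: r must already refer to a live object.
    r = 0;
    //...
}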
This boils back down to references being just another name for the same object.
Really they're pointers plus a bit of syntactic sugar.
This boils back down to references being just another name for the same object. Kind of like a typedef?
If references were indeed just other names, like hardlinks on a filesystem, then yes, a distinct type category would be unnecessary.
"Another name" is a useful mental image, but it is not precise. A reference is another access path to the object. The difference is that the reference can prevent modifications (think reference to const) when used to access an otherwise perfectly modifiable object. Or it can make every access an observable side-effect (reference to volatile) when accessing an otherwise innocent non-volatile object. Or it can add both flavors.
To describe this access path, a distinct type needs to be added to the type system (actually two kinds of types, since rvalue references have different properties).
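A small sketch of those extra flavors of access path (C++11 for the rvalue reference):

void demo() {
    int x = 5;

    const int& cr = x;      // read-only access path to an otherwise modifiable object
    // cr = 6;              // error: can't modify x through cr

    volatile int& vr = x;   // accesses through vr count as observable side effects
    vr = 6;                 // so this write can't be optimised away

    int&& rr = x + 1;       // rvalue reference: binds to the temporary, not to x
}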
I'm jumping in a little late, but I think OP is looking for this:
auto& myref = whatever;
As for why 'auto' is needed at all... you need some kind of type name to identify the statement as a variable declaration. Simply having & myref = whatever; would be too confusing, since & might be the address-of operator or the bitwise AND operator.
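For example (C++11 and later):

int whatever = 42;
auto& myref = whatever;   // deduced as int&; myref is another name for whatever
myref = 7;                // whatever is now 7
// & myref2 = whatever;   // without a type name (or auto), the parser couldn't tell
                          // this apart from an expression using & as an operator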