JLBorges wrote:
> No. std::vector<> would be the only 'proper' way if an upper bound on the number of students is not available.
Just wondering why you said that. Could I ask you to educate me a little here? Hopefully the OP won't mind - it could be good for him/her too.
Edit:
Also, just when I think I have an understanding of how things work and how they should be used, a very experienced person like yourself comes along and says the opposite!! The following is how I understand it at the moment:
If I had 2000 items, a std::list would be OK, but I need to keep in mind whether there is any sorting, finding, or inserting.
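For context, this is the sort of thing I mean, as a minimal sketch (the 2000 items and the int payload are just assumptions for illustration): finding a position in a std::list is a linear walk, but inserting at a known position is cheap.

```cpp
#include <algorithm>
#include <iostream>
#include <list>

int main()
{
    // assume roughly 2000 items, as in the example above
    std::list<int> items;
    for (int i = 0; i < 2000; ++i)
        items.push_back(i);

    // std::list has no random access, so finding an element
    // is a linear walk through the nodes: O(n)
    auto it = std::find(items.begin(), items.end(), 1500);

    // once the position is known, inserting there is O(1)
    // and does not invalidate other iterators
    if (it != items.end())
        items.insert(it, 42);

    std::cout << items.size() << '\n';
}
```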
I could use a std::set or std::map, especially if there is lots of data, because these use self-balancing binary search trees internally (does std::set use an AVL tree or a red-black tree?), so finding is more efficient. I need to be aware of the cost of any rebalancing of the tree that might happen, or of whether the data is already sorted.
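To illustrate the finding part, here is a small sketch (the string keys are made up for the example; the standard only guarantees ordered, logarithmic operations, and implementations typically use a red-black tree):

```cpp
#include <iostream>
#include <set>
#include <string>

int main()
{
    std::set<std::string> names { "alice", "bob", "carol" };

    names.insert("dave"); // O(log n); the tree rebalances as needed

    // lookup is O(log n) rather than a linear scan
    if (names.find("bob") != names.end())
        std::cout << "found bob\n";

    // elements are always kept in sorted order
    for (const auto& n : names)
        std::cout << n << '\n';
}
```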
I am just wary of std::vector because of its inefficiencies when it gets resized. I am aware that it has a strategy of keeping its capacity larger than what is currently required to mitigate this, but I can't help thinking that it is better to avoid reallocating memory altogether. It's especially bad for a vector of vectors. For these reasons, I have been avoiding vectors when there is no known upper bound, and that is why I wanted to query you about this.
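This is the kind of thing I had in mind, as a rough sketch (the counts are arbitrary): push_back reallocates as the vector grows, but a single reserve() call avoids that when even a rough upper bound is known.

```cpp
#include <iostream>
#include <vector>

int main()
{
    std::vector<int> v;
    std::cout << "initial capacity: " << v.capacity() << '\n';

    // without reserve(), push_back reallocates whenever size would exceed
    // capacity; growth is geometric, so reallocations are amortised O(1)
    for (int i = 0; i < 1000; ++i)
        v.push_back(i);
    std::cout << "capacity after 1000 push_backs: " << v.capacity() << '\n';

    // with a rough upper bound, one reserve() removes the reallocations
    // (and the iterator invalidation that comes with them)
    std::vector<int> w;
    w.reserve(1000);
    for (int i = 0; i < 1000; ++i)
        w.push_back(i);
    std::cout << "capacity with reserve: " << w.capacity() << '\n';
}
```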
I suppose that a vector would be OK if it was always going to be small and/or the size doesn't change very often.
I guess if there are only 500-1000 items of data, then it probably won't matter a damn which container one uses.
Were these last 2 comments what you meant?