Is it correct to use std::array<std::vector<int>, ?> in this situation?

Hi everyone,
I wanted to know what the best way of dealing with the following issue is:

I want to store a data structure that has n elements (where n is a user-defined variable that is fixed for the duration of program execution).

Each of the n elements mentioned above is itself a data structure with a variable number of elements (they hold int values, and the number of ints inside each element varies during program execution).

We can have a structure like this

element 1: 3,4,52
element 2: 1,3,4,5,3,5,4

element n: 234,5,4,232,3,2,2,1,3,445,23

What is the best way to define this kind of structure?

I was thinking of std::array<std::vector<int>, ?>, but the “?” marks the problem: I don't know the fixed number of elements (n) in advance.

Thanks
If n is not known in advance, use a vector of vectors rather than an array of vectors.
Hi cire, thanks for your reply
but, taking into account that n is fixed, wouldn't vector representation waste some space or access time?
My question: "Why is that important?" The answer to yours: "It depends."

More cogent: a std::vector fulfills the requirements of the container whereas a std::array does not.
...wouldn't vector representation waste some space or access time?

Both containers provide constant time random access, and occupy O(n) space.

Vectors guarantee insertion at the end (push_back) in amortized constant time.

The vector manages that by increasing its capacity by progressively larger amounts as required. The growth factor is implementation-defined; doubling the capacity when the vector runs out of space is a common choice.

You can call std::vector::reserve() to set the capacity in advance, and eliminate extra storage by calling std::vector::shrink_to_fit().

Are you actually handling so much data that you need to care about the small amount of extra space?
Excellent people!

mbozzi: I don't think growing the capacity will really affect performance, but that is something I have to evaluate more carefully.

cire: if there is no option, I have to use vectors then.

Thank you very much people!

Best,

Dario
I don't think growing the capacity will really affect performance, but that is something I have to evaluate more carefully.


It might not affect performance so much in time, usually, but perhaps more in space. Growing the capacity takes linear time (because all the elements must be copied or moved to a larger block of memory), and up to about half the capacity may be unused slots at any given moment, assuming the capacity doubles when required. std::vector::reserve/shrink_to_fit might help if that's an issue.

Note: Sorry - the right function is reserve, not resize; I've corrected the post above.
Thank you mbozzi, that is an important point