Maximum Size of dynamic array

Apr 8, 2008 at 8:43am
Hi

I was wondering what the restrictions are on allocating a one-dimensional (or multidimensional) array like this:

p=new double[n];

What determines the maximum n? Is there an artificial limit like 65k or so, or is it determined by the cache (level 1, 2, or 3) on my CPU, by the amount of RAM, or even by RAM + swap?

I never really found an answer to this question.

Thanks for your help
Ceearem
Apr 8, 2008 at 9:17am
closed account (z05DSL3A)
If I remember correctly, the 'n' in 'p=new double[n];' would be of type std::size_t, so that would be your limiting factor (assuming you have enough memory).
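
For example, a minimal sketch that just prints the largest value a std::size_t can hold (only illustrating the point; in practice your limit will come from available memory, not from the type):

#include <cstddef>
#include <iostream>
#include <limits>

int main()
{
    // The largest element count that can even be expressed in the
    // std::size_t argument of new[]; real allocations fail long before this.
    std::cout << std::numeric_limits<std::size_t>::max() << '\n';
    return 0;
}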
Apr 8, 2008 at 9:43am
size_t is an unsigned integer type, so basically this would mean n can be something like 4 billion (assuming a 32-bit size_t)?

Apr 8, 2008 at 11:11am
closed account (z05DSL3A)
I think I misread your original post as asking what the 'type' for n is.

I have never really thought too much about the limiting factors for dynamic storage. I would guess that, as dynamic storage is allocated on the heap, the size of your heap would be a limiting factor.
Apr 8, 2008 at 12:43pm
> size_t is an unsigned integer type, so basically this would mean n can be something like 4 billion (assuming a 32-bit size_t)?

This depends on the architecture; on 64-bit computers the max value of size_t is basically "unlimited" (about 1.8 * 10^19, far more than any machine's memory).

However, it's very unlikely that such an allocation would succeed. There are other, system-specific factors that determine the maximum size of a dynamic array: it's far more than any cache size (caches only affect speed, not how much you can allocate), but usually less than the total amount of installed memory. It may also be possible to configure the maximum amount of memory available to a single process.
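
A rough way to see this for yourself, as a sketch (the exact numbers are entirely system-specific, and on systems that overcommit memory a "successful" new may not mean the pages are really backed by RAM):

#include <cstddef>
#include <iostream>
#include <new>

int main()
{
    // Halve a deliberately huge request until operator new[] stops
    // throwing std::bad_alloc; the first size that succeeds is a crude
    // estimate of the largest dynamic array the system will hand out.
    std::size_t n = std::size_t(1) << 40;     // 2^40 doubles; assumes a 64-bit size_t
    while (n > 0)
    {
        try
        {
            double* p = new double[n];
            delete[] p;
            break;                            // this size worked
        }
        catch (const std::bad_alloc&)
        {
            n /= 2;                           // too big, try half
        }
    }
    std::cout << "largest successful count: " << n << '\n';
    return 0;
}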

BTW: of course, in your example you need n * sizeof(double) bytes of memory (typically n*8) to allocate double[n].
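
For instance, something along these lines (assuming the usual 8-byte double):

#include <cstddef>
#include <iostream>

int main()
{
    std::size_t n = 1000000;
    // new double[n] needs n * sizeof(double) contiguous bytes,
    // i.e. about 8 MB here on a platform with 8-byte doubles.
    std::cout << n * sizeof(double) << " bytes\n";
    return 0;
}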