Maximum array size?

int arr[n];

What can be the max size of n?

And for that matter, what can be the max length of a string? If I want to open a 1G file and store its contents in one string, would that be possible (of course, keeping in mind I have 4G of total memory...)

And I know the OS assigns every program a limited amount of memory. How can I take control of memory allocation? For example, in the above case maybe the OS assigns only a few MB to my program, but I need a little over 1G... Is that possible?
int arr[n];

What can be the max size of n?


Depends on the allocated stack size. I'm not sure how this can be configured; it might be a compiler setting or it might be a native API call.

In any event, I wouldn't make n too large.

And for that matter, what can be the max length of a string? If I want to open a 1G file and store its contents in one string, would that be possible (of course, keeping in mind I have 4G of total memory...)


AFAIK a string needs contiguous memory, so it can be as large as the largest piece of contiguous memory (virtual or otherwise) available. This depends on the machine and how RAM is allotted.

And I know the OS assigns every program a limited amount of memory. How can I take control of memory allocation? For example, in the above case maybe the OS assigns only a few MB to my program, but I need a little over 1G... Is that possible?


I don't think the OS does that. To my knowledge programs can consume as much memory as they want, as long as it's available. This is why an entire computer can feel the effects of a single program leaking memory.

Even if you run out of physical RAM, the OS may be able to allocate virtual memory (typically by using hard-disk space as swap). It's slow, but if you're desperate for memory....
I don't think the OS does that.

Actually, it is the OS that is responsible for memory management. A program may request and receive access to memory from the OS, but that is usually handled by the system's libc or libc++. Fine-grained memory management is handled by the libc/libc++ -- that is, the standard library may request a large chunk of memory and divide it up as needed by your program.

C++ allows you to moderate memory management via custom allocators (you'll see them as template arguments to STL containers), giving you control over as much or as little of this process as you like.

The maximum size of a string, etc, is the maximum size chunk of memory you can get from the OS.

Hope this helps.
closed account (4Gb4jE8b)
@ all of the posts above mine, including OP: thank you for asking, this has taught me quite a bit about OS level memory allocation

Well, I can honestly say I don't know for strings and arrays (well, if Disch is to be believed, and I see no reason he shouldn't be, I now know strings have the ability to be as large as they want to be). BUT I have been fond of using stringstreams for my dynamic memory problems, and they can hold quite a few characters. I've never done anything in the gig range, but I have stored over 300 kilobytes in one variable.

And before I get people yelling at me for my lack of ability to code: it was for parsing a text file, and it was an easy solution to a rather annoying problem. Don't judge :P.
I don't think the OS does that. To my knowledge programs can consume as much memory as they want, as long as it's available. This is why an entire computer can feel the effects of a single program leaking memory.

That is true, but AFAIK only in the case of kernel modules... I have had a course on Computer Architecture, and I think we were taught that the kernel is supposed to assign memory to programs. In tech language: working in kernel space, you can have as much memory as you want (not exceeding the system's abilities, though), and when you are in user mode, you only get what the kernel allocates for you.

Actually, it is the OS that is responsible for memory management. A program may request and receive access to memory from the OS

I tend to agree with Douas.

And in addition to these comments, I would like to add that I did end up giving it a shot. Here is what I tried:

#include <cmath>
#include <cstddef>
#include <iostream>
using namespace std;

int main(void)
{
    size_t n = pow(2, 30);   // n = 1G
    char arr[n];             // sizeof(arr) = 1 GB -- a VLA (GCC extension), allocated on the stack
    return 0;
}


The program compiled fine, but when I executed it, it crashed. :( I assume the program couldn't get such a large memory chunk from the operating system on such short notice.
Would anyone elaborate?
What compiler do you use? The Visual C++ compiler doesn't allow such a statement. You're trying to get 1 GB from the stack, which will certainly crash.

From the Microsoft site:
This [compiler] option [/F] sets the program stack size in bytes. Without this option the stack size defaults to 1 MB.
With a file so large, it would be wise to keep it open and stream in each part as it is needed. Otherwise, load a chunk, process it, free up the memory, and load the next chunk.

It's always good not to blow a load of memory on buffering an entire file; that's why Microsoft is moving in the direction of using streams for everything, which uses a lot less memory.
what compiler do you use?

MinGW GCC... The IDE is Code::Blocks. I started using Linux before I started programming, so I prefer the Windows port of the standard Linux C compilers to Microsoft Visual C++. I have my own reasons.

From the Microsoft site:
This [compiler] option [/F] sets the program stack size in bytes. Without this option the stack size defaults to 1 MB.
Can anyone tell me if that is possible with GCC?

With a file so large, it would be wise to keep it open and stream it in when the next part is needed. Otherwise, load a chunk, process it, free up the memory, and load the next chunk in.

I know that. I was just curious. Nonetheless, your point is duly noted.

It's always good not to blow a load of memory on buffering an entire file; that's why Microsoft is moving in the direction of using streams for everything, which uses a lot less memory.

Uses a lot less memory, true. But a waste of processing power, I assume. What is the use of an Intel i7 quad core with 2 threads per core if I am going to have to process a file like I am using an 80286?

And I am not actually trying to make a program that consumes 1 GB of memory. I am just experimenting. I myself believe the maximum memory a program consumes should not exceed its size on the hard disk (exceptions aside).
Topic archived. No new replies allowed.