The stack limit is determined by several system-level things: the OS imposes a maximum, the amount of RAM on your machine matters (you can't ask for a GB on a 128MB system), and compiler/linker flags can set the stack size within those other limits.
The stack size is limited because you want to be able to run many programs at once: if all 1000 of the programs running on your computer had each asked for 1GB up front (stack is UP FRONT, it's allocated at program launch) you would need quite the machine! By limiting it, many programs can run at once without one or two of them stealing all the resources.
It's not a 'bad idea' so much as 'you can't do it past some point'. C++, unlike the vast majority of languages, does not have a lot of places where it says "you can't do that". This is one of those few places, and the reason, mostly, is that the limits are imposed from on high and the language can't do much about it.
That said, I routinely read 4-5 GB files into the heap. Memory allocation isn't that big a deal anymore. And even 'back in the day' it really wasn't as bad as people make it out to be; I had a 'dumb pointer' template class that simply called delete when it went out of scope, and used it on everything for years back then. It didn't do anything else, just constructor and destructor, which is pretty much what the modern smart pointers do, I think. (Actually, it did one other thing: it prevented the user from leaking by reallocating, because you could only allocate it at construction and could not re-assign the pointer to another block etc.)
Having a vector of C++ strings will really chew up available memory fast. Memory mapping a 1GB file this way isn't likely to blow the stack or the heap.
It may not be a C string; it could be a char array meaning 'byte array' (a binary file, for example). I don't use C++ strings for byte arrays (more out of habit than anything else); I don't want to forget and do text things to them.
Memory fragmentation may prevent allocating one bigger than about 3/4 the size of your RAM even if it's dynamic. The stack is a solid (contiguous) block, which is another reason the OS does not like making it too large.
Calling it a char array is a rather odd way IMO to say byte array, i.e. binary file format.
I used to do that almost exclusively. (well, char* allocated to file size, actually).
It's what read(char*, size, ...) and write(char*, size, ...) expect, and [index] gets you to whatever byte offset in the data you need to extract fields.
what do you use? (could be one of the aliases for char, BYTE or something?).
On Linux you can easily set the stack size to 'unlimited'. I think the biggest "danger" in doing so is that if you happen to accidentally write an infinite recursion (which is quite easy to do), your program will use more and more memory and probably get very slow because it starts swapping memory to disk (I hate when that happens).
Running out of heap can usually be handled gracefully through try/catch.
I've read/seen some opinions from "experts" that using exception handling is a design flaw in one's code.
Most of them are "If it's good enough for C then it's good enough for me" types who write C++ code that looks, at best, like C++98.
IMO, the people who do the language standardization believed it has a valid place, and I happen to agree.
I am by no means even remotely a C++ expert, having very badly self-taught myself, and continuing to do it with all the new-fangled changes C++11 and later standards have made.
It would be on the stack. Might as well just use a raw array.
As I was typing my question I did a quickie code snippet with VS 2019 and std::array. No errors when compiled, but it sure did whinge that 1GB of stack memory was more than VS was willing to allow the program, 32-bit or 64-bit. It suggested using heap memory.
I've read/seen some opinions from "experts" that using exception handling is a design flaw in one's code.
Sure, if they can write magic code that can magically handle all possible situations and they don't want to have any kind of protection or handling against anything ever throwing :)
With the honorable exception of giving clear guidelines to very inexperienced beginners who don't want nuance but just want to know how to do it right for 99% of cases (for example, simply telling someone with five minutes' experience to stop using char arrays and just use a string, or that they should basically never use new), I take a very dubious view of any expert announcing that usage of <language_feature_X> is always bad. Every feature is just a tool in the toolbox; right tool in the right place at the right time.