Array/vector segmentation faults when accessing large index numbers

I am working on a project involving a join algorithm (similar to a SQL join), coded in C/C++.

I have tried both arrays and vectors as follows:

        cout << "UINT_MAX = " << UINT_MAX << endl;
        cout << max << " is the calculated max" << endl;
        vector<bool> exists (max, false);
        inFile.open("tempVFile");

        exists[5] = false;
        cout << "Flag -1" << endl;
        exists[292915] = false;
        cout << "Flag0" << endl;
        exists[292916] = false;
        cout << "Flag1" << endl;
        exists[292917] = false;
        cout << "Flag2" << endl;


Above is the vector version. Here is the output when compiled and run:
UINT_MAX = 4294967295
4294967295 is the calculated max
Flag -1
Segmentation fault (core dumped)

Below is the array version.

        bool exists[max];
        cout << "UINT_MAX = " << UINT_MAX << endl;
        cout << max << " is the calculated max" << endl;
        for (unsigned int i = 0; i < max; i++)
        {
                exists[i] = false;
                if (i > 292910)
                        cout << i << " is done" << endl;
        }
        cout << "Post read" << endl;


And the produced output:
UINT_MAX = 4294967295
4294967295 is the calculated max
292911 is done
292912 is done
292913 is done
292914 is done
292915 is done
Segmentation fault (core dumped)



As seen, both versions core dump when trying to access a larger index. The gdb symbolic debugger does not tell me much (but I am by no means an expert at using it).

(gdb) run
UINT_MAX = 4294967295
4294967295 is the calculated max
Flag -1

Program received signal SIGSEGV, Segmentation fault.
0x0805295c in std::_Bit_reference::operator= (this=0x80477c4, __x=false) at stl_bvector.h:87
87 *_M_p &= ~_M_mask;
(gdb) backtrace
#0 0x0805295c in std::_Bit_reference::operator= (this=0x80477c4, __x=false) at stl_bvector.h:87
#1 0x080521b1 in main () at join2.cpp:46


So if anyone could point me in the right direction on this (why it occurs and/or how to fix it), I would be very appreciative.

Thanks!!



An array of 4294967295 bytes would certainly exceed the stack size. You are asking for 4 gigabytes of memory on the stack; that is bound to be a problem...
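
If it really has to be that big, putting it on the heap instead of the stack is another option. This is just a rough sketch of the idea (untested against your code; max stands in for whatever count you compute, and a 4 GB request can still fail with std::bad_alloc if the machine cannot actually provide that much memory):

        // Heap allocation: the array lives on the free store, not the stack.
        // "max" stands in for whatever count the program computes earlier.
        bool* exists = new bool[max]();   // the "()" value-initializes everything to false

        exists[292917] = true;            // indexing works the same as before

        delete[] exists;                  // release it when finished
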
Thanks for the tip.

Changed stack size to unlimited:
limit stacksize unlimited
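(That is the csh/tcsh syntax; the bash equivalent would be ulimit -s unlimited.)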

I am now running the process again. It has not segfaulted yet. I will give you an update once it finishes, but it looks like this is solved.

Thanks again.

Now it is getting stuck at 134,510,656 for some odd reason. My loop hits that number and then loops back to 134,510,600. Since UINT_MAX is greater than that, it shouldn't be a limits error, and my stack size is unlimited. Any thoughts this time? :/

A later run showed it gets stuck at 134,510,688 before looping back to 134,510,601. This would lead me to believe an environmental attribute is causing it.

This is running on a Unix server; if you need more details, let me know what you need and I can get it for you.
Problem solved.

For future reference: the array still uses the stack when it is declared inside the function, even though the for loop is only initializing its values. To get around that, I can declare the array as a global variable.

#include <iostream>
#include <climits>   // for INT_MAX
using namespace std;

// Declared at global scope, so the array is in static storage
// rather than on the stack.
bool myArray[INT_MAX];

int main()
{
  for (unsigned int i = 0; i < INT_MAX; i++)
  {
     myArray[i] = false;
  }

  return 0;
}


This will avoid the problems found above. Again, thanks for the pointer.