Hello guys :),
I have an 80 GB binary file that I need to read in chunks, but as far as I know, the file position fstream uses is a long int, which is definitely not enough to address any point past 20 GB.
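To make it concrete, here is roughly the read pattern I'm after (just a minimal sketch; the file name, chunk size and offset are placeholders), and the seekg call is exactly where I expect the offset type to become a problem:

```cpp
// Minimal sketch of the chunked reading I want to do (all sizes are placeholders).
#include <cstdint>
#include <fstream>
#include <vector>

int main() {
    const std::uint64_t kChunkSize = 64ull * 1024 * 1024;       // 64 MB per chunk (arbitrary)
    std::ifstream in("huge.bin", std::ios::binary);
    std::vector<char> buffer(kChunkSize);

    std::uint64_t offset = 30ull * 1024 * 1024 * 1024;          // e.g. start 30 GB into the file
    in.seekg(static_cast<std::streamoff>(offset));              // is this where the offset overflows?
    in.read(buffer.data(), static_cast<std::streamsize>(kChunkSize));
    // ... process buffer.data(), then seek to the next offset, and so on ...
}
```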
I used to use memory-mapped files, but I'm thoroughly sick of them by now. The same code fails in a different way on every operating system (see the sketch after this list for roughly what I was doing):
Ubuntu: unable to map; the file simply can't be opened past a certain point.
Mint: unable to map, same problem as Ubuntu.
Fedora: a segmentation fault that makes no sense at all; debugging led to a dead end.
CentOS: works fine, but I can't install it on all the machines I have.
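For reference, this is roughly the shape of the mapping code I was using (heavily simplified, not the actual code; the file name, window size and offset are placeholders):

```cpp
// Simplified sketch of the per-window mmap approach (error handling trimmed, assumes 64-bit off_t).
#include <cstdint>
#include <cstdio>
#include <fcntl.h>
#include <sys/mman.h>
#include <unistd.h>

int main() {
    const std::uint64_t kWindowSize = 256ull * 1024 * 1024;     // map a 256 MB window at a time
    int fd = open("huge.bin", O_RDONLY);
    if (fd < 0) { std::perror("open"); return 1; }

    off_t offset = 40ll * 1024 * 1024 * 1024;                   // somewhere past 40 GB (page-aligned)
    void* window = mmap(nullptr, kWindowSize, PROT_READ, MAP_PRIVATE, fd, offset);
    if (window == MAP_FAILED) { std::perror("mmap"); close(fd); return 1; }  // this is where it fails on Ubuntu/Mint

    // ... read from the mapped window ...

    munmap(window, kWindowSize);
    close(fd);
}
```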
I posted the problem here:
http://www.cplusplus.com/forum/general/48971/
and the conclusion I reached is that it's IMPOSSIBLE to get a stable program with memory-mapped files when dealing with huge files...
So now I'm looking for an alternative that supports 64-bit file offsets. What would you recommend?
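For instance, would something like POSIX pread with a 64-bit off_t be the right direction? A minimal sketch of the kind of call I mean (assuming off_t is 64-bit, e.g. via _FILE_OFFSET_BITS=64; names and sizes are placeholders):

```cpp
// Sketch of the kind of 64-bit-offset chunk read I'm hoping for (POSIX pread).
#include <cstdio>
#include <fcntl.h>
#include <unistd.h>
#include <vector>

int main() {
    int fd = open("huge.bin", O_RDONLY);
    if (fd < 0) { std::perror("open"); return 1; }

    std::vector<char> buffer(64 * 1024 * 1024);                 // 64 MB chunk (arbitrary)
    off_t offset = 70ll * 1024 * 1024 * 1024;                   // 70 GB in; needs a 64-bit off_t
    ssize_t got = pread(fd, buffer.data(), buffer.size(), offset);  // read one chunk at that offset
    if (got < 0) std::perror("pread");

    close(fd);
}
```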
Thank you for any efforts :)