Document Classification - NLP with 300,000 n-grams from 200 documents resulting in 50 million frequency values

I am having memory problems on my Mac when trying to run computation-intensive deep learning algorithms for NLP. Memory fills up and the process gets killed. I found that a Python int is generally around 24-28 bytes, whereas an int in C++, Java, or C# is much smaller (4 bytes). Why is the datatype so large in Python? Are C/C++ better in these scenarios?
Why the size of datatype is large in Python?
Because in Python an int is a full object: besides the value itself it stores bookkeeping information such as a type pointer and a reference count. Note also that Python ints can store arbitrarily large numbers, whereas a 4-byte int is limited to a fixed range. See:

http://www.cplusplus.com/reference/climits/

Is C,C++ languages better in these scenarios?
Yes. For storing tens of millions of small frequency counts, C or C++ (or C-backed storage reachable from Python) can use a fraction of the memory, though the gain depends on how well the code is implemented.
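As a rough sketch of the difference, you can compare a plain Python list of ints against the standard-library `array` module, which packs values as 4-byte C ints (a smaller-scale stand-in for the 50 million counts in the question; the per-value figures are approximate and depend on the Python build):

```python
import sys
from array import array

n = 1_000_000
counts_list = list(range(n))        # one boxed int object per value
counts_arr = array('i', range(n))   # packed 4-byte C ints

# The list itself only holds pointers; each int object adds ~28 bytes more.
list_bytes = sys.getsizeof(counts_list) + sum(sys.getsizeof(v) for v in counts_list)
arr_bytes = sys.getsizeof(counts_arr)

print(list_bytes // n, "bytes per value in a list")   # roughly 36
print(arr_bytes // n, "bytes per value in an array")  # roughly 4
```

The same idea is why numeric libraries written in C store data in packed buffers rather than as individual Python objects.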