Size limitations are particularly evident with integers. On some computers, integers can hold values only up to about 32,000; on others, they can go up to a little more than 2,000,000,000. What is the largest value your compiler will accept for a signed integer? What is the most negative allowed value?
I'm not completely sure about this topic, but here's one approach.
Try starting with a large number and see how big the compiler will let it get. Use something like 10,000,000,000: if it's accepted, go up to 13,000,000,000; if it isn't, try something like 8,000,000,000. Then narrow down the billions digit, then the millions, then the thousands and hundreds, until you arrive at the exact maximum your compiler accepts. The same approach should work for negative numbers, I think.
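That manual search works, but if you're using C, the limits are already recorded for you: the standard header <limits.h> defines INT_MAX and INT_MIN for your implementation. Here's a minimal sketch (assuming a hosted C compiler) that prints both values, which you can use to check the result of the search above:

    #include <stdio.h>
    #include <limits.h>

    int main(void)
    {
        /* INT_MAX and INT_MIN are the largest and most negative
           values an int can hold on this particular implementation. */
        printf("Largest int:       %d\n", INT_MAX);
        printf("Most negative int: %d\n", INT_MIN);
        return 0;
    }

On a typical machine with 32-bit two's complement ints this prints 2147483647 and -2147483648; note that the most negative value is usually one greater in magnitude than the maximum.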