I am new to C++ but have a lot of programming experience in Java. In C++ the size of int can vary depending on the compiler and platform. It must be at least as large as short and no larger than long, so there is no guarantee that it is 4 bytes.
My question is: if I am developing an application on Windows and using Windows libraries, how can I be sure that when I run my code on Linux it behaves as I expect and the sizes of int and the other data types remain the same?
On the platform you are running on, you can check the value of the INT_MAX constant and many others defined in the <climits> (or <limits.h>) header. You can also check sizeof(datatype) directly. Hope this helps!
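For example, here is a minimal sketch (the variable names are just for illustration) that prints the limits and sizes on whatever platform you compile it for, so you can compare the Windows and Linux builds:

```cpp
#include <climits>   // INT_MAX, INT_MIN, LONG_MAX, ...
#include <iostream>

int main() {
    // These values are implementation-defined, so they may differ
    // between the Windows and Linux builds of the same source.
    std::cout << "sizeof(short) = " << sizeof(short) << " bytes\n";
    std::cout << "sizeof(int)   = " << sizeof(int)   << " bytes\n";
    std::cout << "sizeof(long)  = " << sizeof(long)  << " bytes\n";
    std::cout << "INT_MAX       = " << INT_MAX       << '\n';
    std::cout << "LONG_MAX      = " << LONG_MAX      << '\n';
}
```

If you need a type whose size is the same everywhere, the fixed-width types mentioned in the other answer (e.g. int32_t from <cstdint>) are the usual way to get that guarantee.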
How can one type of int be faster than another type?
It may be faster to use the machine word rather than a smaller unit for integral arithmetic. Which width is fastest is implementation-defined, and the typedefs that express this are in stdint.h (and, since C++11, in <cstdint>).
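As a rough illustration, a sketch comparing the exact-width, "fast", and "least" families from <cstdint>; the sizes printed will depend on your implementation:

```cpp
#include <cstdint>
#include <iostream>

int main() {
    // Exact-width: exactly 32 bits on platforms that provide it.
    std::int32_t exact = 0;

    // "Fast": at least 32 bits, but the implementation may pick a wider
    // type (often the machine word) if that is faster to operate on.
    std::int_fast32_t fast = 0;

    // "Least": the smallest type with at least 32 bits.
    std::int_least32_t least = 0;

    std::cout << "int32_t:       " << sizeof(exact) << " bytes\n";
    std::cout << "int_fast32_t:  " << sizeof(fast)  << " bytes\n";
    std::cout << "int_least32_t: " << sizeof(least) << " bytes\n";
}
```

On a 64-bit Linux system, int_fast32_t is often 8 bytes while int32_t stays at 4, which is exactly the "use the machine word when it's faster" idea.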