What does int actually represent?

I'm not exactly a beginner when it comes to C; I've spent more than two years working with it. But I'm unsure what int actually represents. I've always assumed it's an integer the same size as a pointer, but I looked in stdint.h for one reason or another and noticed that there are no #ifdefs checking for 16/32/64-bit, just these typedefs straight out:

typedef signed char        int8_t;
typedef unsigned char      uint8_t;
typedef short              int16_t;
typedef unsigned short     uint16_t;
typedef int                int32_t;
typedef unsigned           uint32_t;
typedef long long          int64_t;
typedef unsigned long long uint64_t;


So this has me a bit confused; this appears to assume that int is always going to be 32-bit regardless of architecture. Is this true?
No, it's not. When you use a compiler for which these types have different sizes, the typedefs in its <stdint.h> (or <cstdint> in C++) will be different to match.
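You can see how the fundamental types line up on your own platform with a quick check like this (a minimal sketch; the output is implementation-defined, so it will vary between compilers and ABIs):

#include <stdio.h>
#include <stdint.h>

int main(void)
{
    /* Sizes of the fundamental types are implementation-defined;
       the standard only guarantees minimum ranges. */
    printf("short     : %zu bytes\n", sizeof(short));
    printf("int       : %zu bytes\n", sizeof(int));
    printf("long      : %zu bytes\n", sizeof(long));
    printf("long long : %zu bytes\n", sizeof(long long));
    printf("void *    : %zu bytes\n", sizeof(void *));

    /* The exact-width typedefs are fixed wherever they are provided. */
    printf("int32_t   : %zu bytes\n", sizeof(int32_t));
    return 0;
}

On a typical 64-bit Linux system (LP64) this prints 2/4/8/8/8, while on 64-bit Windows (LLP64) long is 4 bytes, so even two 64-bit platforms disagree.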
So what does define int, then? Is it just whatever the compiler writer felt like doing, or does it follow some form of pattern or design?

If int doesn't represent an integer the same size as a pointer, what does? (I want it for byte offsets inside allocations.)
I think int is supposed to map onto the platform's natural word size, so however big the general-purpose integer CPU registers are is likely to be your int size. However, that is not enforced by the standard; the standard only requires that int cover at least the range -32767 to 32767 (i.e. at least 16 bits) and be no narrower than short. Beyond that, it really is up to the compiler designer, or in practice the platform ABI.
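So if your code genuinely depends on a particular width, it's safer to state that assumption than to guess. A minimal sketch, assuming a C11 compiler for _Static_assert:

#include <limits.h>

/* Fail the build on any target where int is narrower than 32 bits. */
_Static_assert(INT_MAX >= 2147483647, "this code assumes int is at least 32 bits");

Or just use int32_t directly, which is the whole point of stdint.h.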
Ah, that's great, thanks. I guess size_t would be a good (if odd) datatype to use as an offset inside an allocation, since malloc/realloc take a size_t, so you can't allocate anything bigger anyway. (Pointers require constant typecasting to work with in the way I'm using them.)
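For anyone who finds this later: the usual way to avoid the constant typecasting is to keep the base pointer as unsigned char * and do the offset arithmetic in size_t. A minimal sketch (the sizes and offset here are made up for illustration):

#include <stdlib.h>
#include <string.h>

int main(void)
{
    size_t total = 64;                    /* size of the whole allocation */
    unsigned char *base = malloc(total);  /* byte-addressable view */
    if (base == NULL)
        return 1;

    size_t offset = 16;                   /* byte offset into the block */
    int value = 42;

    /* Arithmetic on unsigned char * advances one byte at a time, so
       base + offset is exactly "offset bytes into the allocation". */
    memcpy(base + offset, &value, sizeof value);

    free(base);
    return 0;
}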