Help using 0xDEADBEEF in a memory pool problem

Jun 10, 2013 at 3:50pm
Hello,

I am working on a problem in which an integer pointer is initialized to 0xDEADBEEF. Memory for it is allocated from my memory pool and then deallocated later.
I've read docs on this magic number, but I'm not sure how to use it.
Please help!

Thanks,
M
Jun 10, 2013 at 8:02pm
Have you tried grilling it? That's my favorite way!
Jun 11, 2013 at 12:25am
What o/s, compiler, IDE?

And how are you seeing the deadbeef?

And what docs?

Andy
Last edited on Jun 11, 2013 at 12:30am
Jun 11, 2013 at 2:21am
@mukulabdagiri

I don't know if it helps, but the hex value 0xDEADBEEF is 3,735,928,559 in decimal.
Jun 11, 2013 at 8:57am
The point of initialising memory to 0xDEADBEEF is that it's a value whose hex digits spell out "DEADBEEF", which is immediately noticeable when looking at a hex dump of memory. If you're using a debugger to inspect the contents of memory and you see the letters "DEADBEEF", you'll know that memory there is uninitialised (or has been freed and not reused).
Jun 11, 2013 at 11:02am
Do not rely on it, though. In a release build the memory is simply uninitialised, and its contents are undefined.
Topic archived. No new replies allowed.