NULL is a macro that expands to a null pointer constant, typically 0 (or ((void *)0) in C); it is used for pointers.
'\0' is a character constant with the value 0. In C++ its type is char; in C, character constants actually have type int.
"" is a character sequence containing only a '\0' (Notice that a character sequence is not a character)
So NULL and 0 are interchangeable as null pointer constants, while '\0' has the same value but may have a different type.
You can't initialize a string (a char array) with an integral value.
The compiler differentiates between NULL and 0 on occasion, but in practice they come down to the same thing. On virtually every platform, storing NULL in a sequence of bits makes them all 0, just as storing 0 does (strictly speaking, the standard doesn't guarantee that a null pointer is represented as all-zero bits, but that's how real platforms work).
NULL == 000000... == 0
I think a good way of looking at it is: NULL is for pointers, 0 (or 0.0) is for numbers, and '\0' is for text (characters and strings). Otherwise they're exactly the same :)