Hello, I am trying to load a large dictionary file into a hash table for a spell checker. I am getting a segfault when word_count is 6318. I ran a backtrace in gdb and it is segfaulting on the line marked below, and I am not sure why. The text file has 143091 words, each on its own line ending with a '\n'. Thank you.
> node *new_node = malloc( sizeof(new_node) );
'new_node' is a pointer, so sizeof(new_node) gives you the size of a pointer, not the size of a 'node'. Since those sizes differ, you end up writing into memory that was never reserved.
You can use valgrind to confirm this.
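For reference, here is a minimal sketch of a corrected allocation inside a helper function. The node struct, the LENGTH constant, and the make_node helper are assumptions based on your description rather than your actual code; the key point is that sizeof *new_node (or sizeof(node)) reserves the whole struct. The sizeof *new_node form is the usual "use the variable, not the type" idiom: the dereference makes it the size of the pointed-to type, not the pointer itself.

```c
#include <stdlib.h>
#include <string.h>

#define LENGTH 45  // assumed maximum word length; adjust to your program

// Assumed node layout for a chained hash table; adjust to match your struct.
typedef struct node
{
    char word[LENGTH + 1];
    struct node *next;
} node;

// Hypothetical helper: allocate and initialize one node for a dictionary word.
node *make_node(const char *word)
{
    // sizeof *new_node (or sizeof(node)) reserves the whole struct,
    // not just the few bytes of a pointer that sizeof(new_node) gives.
    node *new_node = malloc(sizeof *new_node);
    if (new_node == NULL)
    {
        return NULL;  // always check malloc for failure
    }
    strncpy(new_node->word, word, LENGTH);
    new_node->word[LENGTH] = '\0';
    new_node->next = NULL;
    return new_node;
}
```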
That makes sense to me. I thought it should be sizeof(node), but I was told on another forum to use the variable name rather than the type name. I will try that, thank you for your reply.