I've had programs segfault at the very end before, and it was always solved by adding/modifying a destructor of one of my classes. However, a program I'm working on right now crashes (most likely a segfault) when the program ends, but none of my classes use any dynamic memory at all, just a bunch of vectors. With no dynamic memory, there shouldn't be a reason for any custom destructors.
My program is too big to dump all here, so I was just wondering if anyone knew of any other reasons for a program to crash/segfault after it's finished, or if there are other reasons to need destructors.
I'm compiling it using Visual Studio, and it involves reading a file if that's relevant.
Check array accesses and loop bounds (especially strided loops) for potential buffer overflows. Check your use of raw pointers and potentially dangling references.
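For example, here is a sketch of how a classic one-past-the-end write (a hypothetical off-by-one loop, not your code) can corrupt heap bookkeeping and only crash when the vector's destructor runs at exit, and how std::vector::at turns it into an immediate, catchable error:

```cpp
#include <stdexcept>
#include <vector>

// The loop condition `i <= v.size()` touches one element past the end.
// With operator[] that is undefined behavior -- it can silently corrupt the
// allocator's bookkeeping and crash only when the vector is destroyed at
// program exit. With .at() it throws right away instead.
// Returns true if the out-of-range access was caught.
bool off_by_one_is_caught() {
    std::vector<int> v(4);
    try {
        for (std::size_t i = 0; i <= v.size(); ++i)  // bug: should be i < v.size()
            v.at(i) = 0;                             // .at() bounds-checks
    } catch (const std::out_of_range&) {
        return true;   // caught at i == 4 instead of corrupting memory
    }
    return false;
}
```

Switching suspect indexing from operator[] to .at() is a cheap way to localize this kind of bug, even if you switch back afterwards for performance.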
Are you using smart pointers to enforce ownership semantics? If not, make sure resource ownership is consistent. Are you (for instance) explicitly closing your file twice?
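A minimal sketch of the close-exactly-once discipline, using std::tmpfile as a stand-in for your real input file:

```cpp
#include <cstdio>

// Open, use, and close a file exactly once; returns true on success.
// std::tmpfile() here stands in for the program's actual input file.
bool read_once() {
    std::FILE* f = std::tmpfile();
    if (!f) return false;

    // ... read from f ...

    std::fclose(f);   // close exactly once
    f = nullptr;      // nulling the pointer makes an accidental second
                      // std::fclose(f) detectable, instead of the undefined
                      // behavior a double close would be
    return true;
}
```

Better still, use a type that owns the handle (std::ifstream, or a small RAII wrapper) so a second close is impossible by construction.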
Compile with all warnings on: -Wall -Wextra -pedantic-errors with GCC or Clang, or /W4 with MSVC. Make sure you are generating code with some stack-smashing protection: -fstack-protector (MSVC's equivalent, /GS, is on by default).
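For instance, assuming a single translation unit named main.cpp (substitute your own files), the invocations would look roughly like:

```shell
# GCC/Clang: warnings as a baseline, stack protection, debug info
g++ -Wall -Wextra -pedantic-errors -fstack-protector -g main.cpp -o main

# MSVC (from a Developer Command Prompt): /W4 for warnings;
# /GS stack checks are already on by default
cl /W4 /EHsc main.cpp
```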
You may find the Valgrind suite useful if you're on a Unix-like system. (It's pretty indispensable.)
Non-default destructors should be written whenever the Rule of Three (Rule of Five, nowadays) applies, or when your class manages a resource. Releasing that resource in the destructor is idiomatic C++ (the poorly named Resource Acquisition Is Initialization, or RAII) and is one of the language's most important "killer features". (A "resource" is not always memory.)
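A minimal sketch of such a class: a bare-bones FILE* owner (not a replacement for the standard streams), with the Rule of Five honored so the handle is closed exactly once however the wrapper is passed around:

```cpp
#include <cstdio>
#include <utility>

// Minimal RAII owner of a C FILE*. The destructor releases the resource;
// copying is disabled (no two owners) and moving transfers ownership.
class File {
public:
    explicit File(std::FILE* f) : f_(f) {}
    ~File() { if (f_) std::fclose(f_); }   // release on destruction

    File(const File&) = delete;            // Rule of Five: no copies
    File& operator=(const File&) = delete;

    File(File&& other) noexcept : f_(other.f_) { other.f_ = nullptr; }
    File& operator=(File&& other) noexcept {
        if (this != &other) {
            if (f_) std::fclose(f_);
            f_ = other.f_;
            other.f_ = nullptr;
        }
        return *this;
    }

    bool open() const { return f_ != nullptr; }

private:
    std::FILE* f_;
};

// std::tmpfile() stands in for a real file here.
bool demo() {
    File a(std::tmpfile());
    File b(std::move(a));   // ownership moves; a will no longer close it
    return !a.open() && b.open();
}   // b's destructor closes the file here, exactly once
```

In practice you would reach for std::unique_ptr with a custom deleter or std::ifstream before writing this by hand, but the pattern is the same: the destructor, not the caller, releases the resource.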