Hi,
I know the Ubuntu developers must be right. Have they ever been wrong? I shouldn't try to enable core dumps, since they didn't provide any obvious way to turn them on...
However, some stubborn part of me keeps thinking that in certain circumstances it would be really, really helpful if I could just get a core dump when my program crashes...
So, that said, the only way I have found to permanently enable core dumps for my program is the following, which worked fine yesterday and crashes today:
#include "hasher.h"
#include "Settings.h"
#include <QtGui>
#include <QApplication>
#include <iostream>
#include <sys/resource.h>
int main( int argc, char* argv[] )
{
    rlimit* limit;
    limit->rlim_cur = 50000;
    setrlimit( RLIMIT_CORE, limit );
    getrlimit( RLIMIT_CORE, limit );
    //std::cout << "core limit: " << limit->rlim_cur << std::endl;
    QApplication app(argc, argv);
    Settings::setPath( QSettings::IniFormat, QSettings::UserScope, QCoreApplication::applicationDirPath() );
    Hasher window;
    window.show();
    app.connect(&app, SIGNAL(lastWindowClosed()), &app, SLOT(quit()));
    return app.exec();
}
Now, what is the beauty of errors like this? Well, if I uncomment the cout line, everything works fine; if the cout is gone, it crashes. I wonder who invents things like this? Has all software been written by Monty Python???
As it happens, since the crash is reproducible every time, I have been able to enable core dumps for the session through the shell with ulimit -c 50000, and obtained a core dump of the crash. This is what gdb says:
Program terminated with signal 11, Segmentation fault.
#0 0x080719c0 in main (argc=1, argv=0xbfe4be74) at main.cpp:13
13 limit->rlim_cur = 50000;
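In case it helps to spot what goes wrong, this is a stripped-down version of the rlimit setup I am now planning to try instead. It is only a sketch of my current guess, not something I have verified: the rlimit struct lives on the stack and its address is passed to getrlimit/setrlimit, instead of going through the limit pointer, which in the code above is never pointed at an actual struct.

#include <iostream>
#include <sys/resource.h>

int main()
{
    // The struct itself is on the stack, so getrlimit/setrlimit
    // have real memory to read from and write to.
    rlimit limit;
    if ( getrlimit( RLIMIT_CORE, &limit ) != 0 )
    {
        std::cerr << "getrlimit failed" << std::endl;
        return 1;
    }

    // Only change the soft limit; the hard limit (rlim_max) stays as it is.
    limit.rlim_cur = 50000;
    if ( setrlimit( RLIMIT_CORE, &limit ) != 0 )
    {
        std::cerr << "setrlimit failed" << std::endl;
        return 1;
    }

    getrlimit( RLIMIT_CORE, &limit );
    std::cout << "core limit: " << limit.rlim_cur << std::endl;
    return 0;
}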
If anybody has a clue, it will be my pleasure to hear about it...