I am running a program remotely in Linux that outputs a lot to the screen. When I left this to run for a while, the machine locked up and I assume it ran out of memory. Is there a limit to how much I can output to the screen? If so, would piping the output to a file help?
Wow, 100 GB of RAM! I don't know much about remote computing, but it depends on where the data the program outputs is being stored. Have you tried restarting the machine?
/* Don't actually run this program; it's just an example. */
#include <iostream>
#include <string>   // needed for std::string

int main()
{
    std::string programstate = "It's working";

    // Print the same line 100,000 times (20,000 * 5).
    for (int i = 0; i < 20000; i++)
    {
        for (int j = 0; j < 5; j++)
        {
            std::cout << programstate << std::endl;
        }
    }
    return 0;
}
So why would the system run out of memory? Sure, the terminal would slow down as it renders all that output, but that shouldn't exhaust memory.
Are you trying to do this over a terminal connection (like PuTTY)? 100,000 lines being sent practically instantly over such a connection would make me suspect something funky happening with the connection protocol, latency, or maybe even a security service running on the remote box. A quick test would be to dump the output to a file and then grab the file using conventional methods.
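For example, the loop above could write to a file instead of std::cout. This is a minimal sketch of that idea; the filename output.log is just made up for illustration:

/* Same loop as the example above, but writing to a file instead of
   the terminal. "output.log" is a hypothetical filename. */
#include <fstream>
#include <string>

int main()
{
    std::ofstream log("output.log");   // opens (or truncates) the file
    std::string programstate = "It's working";

    for (int i = 0; i < 20000; i++)
    {
        for (int j = 0; j < 5; j++)
        {
            log << programstate << '\n';   // no terminal rendering involved
        }
    }
    return 0;   // log is flushed and closed automatically here
}

Alternatively, you can leave the program untouched and redirect at the shell, e.g. ./myprog > output.log (where myprog stands in for whatever your binary is called), then pull the file back with scp.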
I'm using ssh to log into the machine, so this could be possible.
My worry was that the output had to be stored in some sort of buffer, which could potentially fill up.
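One buffering detail that is relevant here: std::endl both writes a newline and flushes the stream, so the example above forces 100,000 individual flushes over the ssh session. A minimal sketch of the difference (this is standard C++ behaviour, not specific to your program):

#include <iostream>

int main()
{
    // std::endl writes '\n' AND flushes the stream each time,
    // so every line becomes an immediate write over the connection.
    std::cout << "slow: flushed on every line" << std::endl;

    // '\n' just appends a newline; the stream flushes on its own
    // in larger chunks, which is much cheaper over a network.
    std::cout << "faster: buffered normally" << '\n';

    return 0;   // std::cout is flushed automatically at exit
}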