Hello
I have this test code:
|
#include <windows.h>
#include <process.h>
#include <stdio.h>

unsigned __stdcall thread_test(void *)
{
    for(int i = 0; i < 10000; i++)
    {
        i += 1;
        i -= 1;
    } // simulating processing
    _endthreadex( 0 );
    return 0;
}
int main()
{
    HANDLE hThread;
    while(1)
    {
        getchar();
        hThread = (HANDLE)_beginthreadex( NULL, 0, thread_test, 0, 0, NULL );
        CloseHandle( hThread );
    }
}
|
The problem is that it fills up (leaks) RAM by about 4 K every time I press ENTER. I can see this in Task Manager. However, I tested it further and the memory stops growing when it reaches around 133,000 K.
After discovering that it stops at that amount, I thought the memory was not actually used/leaked, but I noticed that once usage reaches 133,000 K, the program runs about 4-5 times slower than it did at the beginning (the higher the memory usage, the slower the program runs).
This is only a test program, but I want to make a multithreaded server (I've already written the processing, the sending, and all the other parts; right now, instead of creating a thread, I call the processing function directly) and I am stuck at this part, because I don't want my server to run 5 times slower after 15-20 minutes. The design I want to use is the same as in the code above: in main there is a loop that waits for connections; when one arrives, it creates a thread that processes that connection.
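For reference, the thread-per-connection design described above might be sketched as below. This is only a sketch under assumptions: `connection_thread` and `accept_loop` are hypothetical names, Winsock is assumed to be already initialised with `WSAStartup`, `listen_sock` is assumed to be a bound, listening socket, and the actual per-connection processing code (not shown here) would go where the comment indicates:
|
// Sketch only. Assumes WSAStartup has been called and listen_sock is
// a bound socket on which listen() has already succeeded.
#include <winsock2.h>
#include <windows.h>
#include <process.h>
#include <cstdint>

unsigned __stdcall connection_thread(void *arg)
{
    SOCKET client = (SOCKET)(uintptr_t)arg;
    // ... existing per-connection processing would go here ...
    closesocket(client);
    _endthreadex(0);
    return 0; // never reached; silences the missing-return warning
}

void accept_loop(SOCKET listen_sock)
{
    for (;;)
    {
        SOCKET client = accept(listen_sock, NULL, NULL);
        if (client == INVALID_SOCKET)
            continue;

        HANDLE hThread = (HANDLE)_beginthreadex(
            NULL, 0, connection_thread, (void *)(uintptr_t)client, 0, NULL);
        if (hThread != NULL)
            CloseHandle(hThread); // detach: thread frees itself on exit
        else
            closesocket(client);  // thread creation failed
    }
}
|
Closing the thread handle right after `_beginthreadex` (as in the test code above) is what detaches the thread, so its resources can be released when it finishes.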
So the question is: how can I make a thread stop taking up memory after it exits?
I compiled it with Code::Blocks and with Visual Studio. With both, the memory fills up (the limit above is for Code::Blocks; I don't know if Visual Studio has a different limit).
Please don't suggest the thread class.