File gets corrupted when > 2GB

Hi

I have a TCP socket service application running on an NTFS Windows 2003 system. It ran fine for a year and then suddenly started producing corrupted log files at random. We have 7 servers and the problem appears randomly on each of them, but not every day. One log file grows to around 4 GB in a day, and quite often we find it filled with NULL characters (ASCII hex value 0) after it has collected roughly the first 10 hours of logs.
More symptoms:
- There is no specific size at which the problem starts, but it is generally seen after the file passes 2 GB.
- On the same system, on some days we get a 4 GB file with no corruption at all, so exceeding 2 GB does not always cause the problem.
- The problem appears randomly across the different servers.
- We are using fwrite() to write to this file, which is opened with fopen() in "w+t" mode (a minimal sketch of this follows below).
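
For context, the log writing path is essentially the following (a minimal sketch of what the bullets describe; open_log()/write_log() are illustrative names, not the actual service code):

#include <cstdio>

// Minimal sketch of the logging path described above; the function names
// are illustrative, not the real service code.
FILE* open_log(const char* path)
{
    // "w+t" = create/truncate, read + write, text mode, so on Windows every
    // '\n' in the buffer is translated to CR/LF on the way to disk.
    return std::fopen(path, "w+t");
}

size_t write_log(FILE* fp, const char* msg, size_t len)
{
    return std::fwrite(msg, 1, len, fp);
}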

I wrote a small C++ application and was able to create files of 8 GB as well. So file size seems related, but size by itself does not directly cause the problem.
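
The size test was along these lines (a rough sketch; the file name, buffer size and fill byte are made up for illustration):

#include <cstdio>
#include <cstring>

// Minimal sketch of the standalone size test: write 8 GB with fwrite() and
// see whether large files alone trigger any problem.
int main()
{
    FILE* fp = std::fopen("bigtest.log", "w+t");
    if (!fp)
        return 1;

    char buf[64 * 1024];
    std::memset(buf, 'A', sizeof(buf));

    const unsigned long long target = 8ULL * 1024 * 1024 * 1024;  // 8 GB
    unsigned long long written = 0;

    while (written < target)
    {
        size_t n = std::fwrite(buf, 1, sizeof(buf), fp);
        if (n != sizeof(buf))
            break;  // stop on the first short write
        written += n;
    }

    std::fclose(fp);
    return 0;
}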

I have run out of ideas and would like more heads to ponder this. How can this be debugged further?

Thanks in Advance
DD

This is a broad description of the problem. I suggest you debug the application; one thing you can do is put a breakpoint where it writes to the log file.

Maybe you can set a conditional breakpoint that fires when it writes a null, or when the variable's value is null during the write; the break will hit and then you can debug and see what is causing the problem.

Putting a normal breakpoint will not help, as you cannot sit in the debugger for 10 hours. Alternatively, you could add debug messages that give you a hint. Narrowing the problem down to a specific part of the code will make it much easier to find.
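
For example, a thin wrapper around fwrite() along the lines below could record a diagnostic the moment a NULL byte is about to be written, instead of waiting at a breakpoint for hours (a rough sketch; checked_fwrite() and the diagnostics file name are mine, not from the original post). Since ftell() reports the position as a long, a negative or wrapped value once the file passes 2 GB may also be a useful clue, so it is logged too.

#include <cstdio>
#include <cstring>

// Rough sketch of a diagnostic wrapper: route every log write through
// checked_fwrite() so a message is recorded when a NULL byte is about to be
// written, or when the reported file position looks suspicious.
size_t checked_fwrite(const void* buf, size_t size, size_t count, FILE* fp)
{
    const size_t bytes = size * count;
    const long   pos   = std::ftell(fp);  // long: may be negative/wrapped
                                          // after ~2 GB, itself a clue

    if (std::memchr(buf, 0, bytes) != NULL || pos < 0)
    {
        FILE* diag = std::fopen("fwrite_diag.txt", "a");
        if (diag)
        {
            std::fprintf(diag,
                         "suspicious write: %lu bytes, ftell() = %ld\n",
                         (unsigned long)bytes, pos);
            std::fclose(diag);
        }
    }
    return std::fwrite(buf, size, count, fp);
}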