Cannot Parse Large CSV Files

I found this snippet online and have been using it for years. It reads CSV files into in-memory arrays. It works fine for "small" CSV files, but will not convert larger ones, "large" meaning 5000+ lines or roughly 750 KB. The program just exits without an error and without producing the array. I want to read larger files; how do I do that?

istream& operator >> ( istream& ins, record_t& record );
istream& operator >> ( istream& ins, data_t& general );


int main (int argc, char *argv[]) {

    ifstream decodefile;
    decodefile.open("myFile.csv");

    // Read the whole file into memory via the operator>> for data_t.
    data_t general;
    decodefile >> general;
    decodefile.close();

    return 0;
}


// Read one CSV line and parse each comma-separated field as a double.
istream& operator >> ( istream& ins, record_t& record )
  {
  record.clear();
  string line;
  getline( ins, line );

  // Split the line on commas and convert each field.
  stringstream ss( line );
  string field;
  while (getline( ss, field, ',' ))
    {
    stringstream fs( field );
    double f = 0.0;
    fs >> f;
    record.push_back( f );
    globalint++;   // count every field parsed
    }
  return ins;
  }

// Read records until the stream is exhausted, collecting them into general.
istream& operator >> ( istream& ins, data_t& general )
  {
  general.clear();
  int g = 0;

  record_t record;
  while (ins >> record)
    {
      g++;                          // number of records read
      general.push_back( record );
    }
  return ins;
  }
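
For anyone trying to compile the snippet as posted: it omits the headers and type definitions it relies on. It appears to be based on a commonly circulated CSV-reading example, so the missing declarations would presumably look something like the sketch below (the exact typedefs and the globalint counter are assumptions, since the post does not show them):

#include <iostream>
#include <fstream>
#include <sstream>
#include <string>
#include <vector>
using namespace std;

// Assumed definitions, not shown in the original post:
typedef vector<double>   record_t;   // one parsed CSV row
typedef vector<record_t> data_t;     // all rows of the file
int globalint = 0;                   // global field counter the snippet increments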
Have you recompiled it? If the binary has been around for years, it could just have been built with a smaller memory model. I don't see anything that would limit it here, at a glance.

The files you mention fit in memory effortlessly on a modern machine...
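
As a rough sanity check of that claim, here is a back-of-envelope estimate of the raw payload (the column count per line is an assumption); it comes out well under a megabyte:

#include <iostream>

int main() {
    const long lines  = 5000;                             // size reported in the question
    const long fields = 20;                               // assumed columns per record
    const long bytes  = lines * fields * sizeof(double);  // raw size of the parsed doubles
    std::cout << bytes / 1024.0 << " KiB of doubles\n";   // ~781 KiB
    return 0;
}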
What are record_t and data_t? It would be easier to see what, if anything, is wrong if you posted a small, complete program that illustrates your problem.

Please post a small sample of your input file.
I recompiled the code with the latest g++ compiler. That did the trick! Thanks.
Topic archived. No new replies allowed.