Hi,
I have a set of files with the following format of data:
File 1
A B C
1 1 2
2 1 2.2
3 1 3.2
4 3 4.1
5 3 4
6 3 4
File 2
A B C
1 5 6.1
2 3 6.2
3 4 6.1
4 5 6.2
I want to know the best way of reading the files and putting the data into an array[][][].
The problem is that I don't know the amount of data in each file, so the size of the different arrays should depend on the number of lines in the file.
Thanks
Although I don't understand vectors yet (this is a great opportunity to learn), I think I get the idea.
But what if I wish to do this with plain arrays, to keep it as fast as possible? Is that possible?
In other words, a file has three columns. The first row has a single character in each column, and the other lines have int, int, double.
Is that correct?
Use std::vector. It can grow dynamically.
Call the vector's reserve() first with an "educated guess" to avoid repeated reallocations.
Use push_back() or emplace_back() to add elements to the vector.
Use a custom struct or std::tuple as the vector's value_type; either can store the numbers of one row.
If you use a statically allocated array, it has to be big enough for a file of any reasonable size. If a typical file has far fewer lines, you waste memory. (Furthermore, such arrays live in stack memory, and the stack is of limited size.)
If you allocate a plain array dynamically, you also have to deallocate it appropriately, and trust us when we say that you'd rather learn std::vector first.
Will do. Vectors first, then.
I was thinking of using a system command to find out the number of lines in the files and then allocating the arrays based on that.
Is there a system command to find out the number of lines in a file?