I'm working on a database program that reads CSV and/or tab-delimited text files. I've always used DataGridViews with ease, but I'm having some issues now that I'm working with large files and searching for specific combinations of data.
The current program deals with larger files: the text files already have over 20,000 rows of data, with somewhere between 5 and 12 columns. These files will grow as the year goes on, so I'm expecting at least 100,000 rows by year's end, probably more.
My program searches for occurrences of data. Some of these searches need to match several pieces of data in combination, depending on what the user is interested in. Consequently, the program's load time is no longer practical.
At the moment I load all the data into various DataGridViews in the Form_Load event. Data is fed into the DataGridView line by line: StreamReader::ReadLine reads each line into a string, each string (line) is split on the delimiter into an array<String^> using Split, and that array is placed across a row of the DataGridView. The process repeats in a loop until the end of the file is reached.
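In outline, the loading loop looks like the following (a simplified sketch of what I described, not my exact code; the function and variable names are placeholders):

    // Read a delimited text file into a DataGridView, one line per row.
    using namespace System;
    using namespace System::IO;
    using namespace System::Windows::Forms;

    void LoadFileIntoGrid(DataGridView^ grid, String^ path, wchar_t delimiter)
    {
        // Stack semantics: the reader is disposed automatically at scope exit.
        StreamReader reader(path);
        String^ line;
        while ((line = reader.ReadLine()) != nullptr)
        {
            // Split one line into its fields on the delimiter.
            array<Object^>^ cells = line->Split(delimiter);
            // One grid row per line; assumes the grid's columns already exist.
            grid->Rows->Add(cells);
        }
    }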
Can anyone suggest some pathways I could take to improve the speed at which my handling/searching of data is done?
I assume that you use C++/CLI, so I would suggest using the OleDb classes. Here is an article on how to do it. It's in C# but it should not be too difficult to translate to C++/CLI.
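The gist, translated to C++/CLI, is below. Treat it as an untested sketch: the connection-string values are the usual ones for the Jet text driver (note Jet is 32-bit only; Microsoft.ACE.OLEDB.12.0 is the x64 alternative), and the folder/file names are placeholders.

    using namespace System;
    using namespace System::Data;
    using namespace System::Data::OleDb;

    DataTable^ LoadDelimitedFile(String^ folder, String^ fileName)
    {
        // With the Jet text driver, Data Source is the folder holding the
        // file; the file name itself appears only in the SELECT statement.
        String^ connStr = String::Format(
            "Provider=Microsoft.Jet.OLEDB.4.0;Data Source={0};"
            "Extended Properties=\"text;HDR=Yes;FMT=Delimited\"", folder);

        OleDbDataAdapter^ adapter = gcnew OleDbDataAdapter(
            String::Format("SELECT * FROM [{0}]", fileName),
            gcnew OleDbConnection(connStr));

        DataTable^ table = gcnew DataTable();
        adapter->Fill(table);   // Fill opens and closes the connection itself.
        return table;
    }

Once the data is in a DataTable you can bind the whole thing to the grid in one assignment (dataGridView1->DataSource = table) and run your combination searches with DataTable::Select or a DataView's RowFilter, which should be far faster than walking the grid row by row.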
I tried the OleDb approach, but the error message I get states:
"An unhandled exception of type 'System.Data.OleDb.OleDbException' occurred in System.Data.dll
Additional information: 'C:\Support Services\Data\test.csv' is not a valid path. Make sure that the path name is spelled correctly and that you are connected to the server on which the file resides."
Can anyone see where I'm going wrong? The file certainly exists!
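One thing I'm wondering about as I re-read the article: its connection string seems to treat Data Source as a folder, not a file. Is the difference below what's tripping me up? (The paths are the ones from my error; the strings are my paraphrase of the two patterns, not my exact code.)

    // What I appear to be doing: the full file path as Data Source.
    String^ suspect = "Provider=Microsoft.Jet.OLEDB.4.0;"
                      "Data Source=C:\\Support Services\\Data\\test.csv;"
                      "Extended Properties=\"text;HDR=Yes;FMT=Delimited\"";

    // What the text driver seems to expect: the folder as Data Source, with
    // the file referenced only in the query, e.g. SELECT * FROM [test.csv].
    String^ expected = "Provider=Microsoft.Jet.OLEDB.4.0;"
                       "Data Source=C:\\Support Services\\Data;"
                       "Extended Properties=\"text;HDR=Yes;FMT=Delimited\"";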