Please share your thoughts and ideas!
I have thousands of files (.txt) that I have to parse to extract bits from each one (e.g. is the 5th bit of each file 0 or 1?). Is there any way of dealing with this many files for analysis?
#include <iostream>

int main(int argc, const char* argv[]) {
    for (int i = 0; i < argc; ++i) {
        std::cout << argv[i] << std::endl;
    }
    std::cin.get();
    return 0;
}
I used the above to push the bits of each file into a vector, but that requires thousands of vectors to be defined.
Any hint on reading and dealing with such a big number of files would be very helpful. Please shed some light on it!
Thanks in advance for your attention.
Thanks, Athar.
Sorry, I did not make it clear: I need all the bits out of all the files (e.g. from the 1st bit to the 9999th bit). Reading the files from a folder seems to be one solution, but I could not get it working even after following some descriptions. I would very much appreciate your further attention.
If you're looking to implement a cross-platform, lightning-fast file analysis program and need a little more muscle than the standard library provides, I would strongly recommend looking into boost::filesystem.
// Going from memory here; I haven't tried to compile it.
// Keep in mind, error checking is excluded.
#include <boost/filesystem.hpp>

int main(int argc, const char* argv[])
{
    boost::filesystem::path p = boost::filesystem::system_complete(boost::filesystem::path(argv[1]));
    boost::filesystem::directory_iterator end;
    for (boost::filesystem::directory_iterator iter(p); iter != end; ++iter)
    {
        // Process each file here, e.g. via iter->path()
    }
}