How to remove duplicates from a regex result?
Nov 28, 2012 at 9:31pm UTC
Hello, as my nickname states I'm a noob in C++.
I'm parsing results from a file with a regex and I get some duplicates. How can I check for duplicates and keep only one of each?
Example of code:
std::tr1::match_results<std::string::const_iterator> res;
std::tr1::regex_search(buf, res, rx);
I don't know how to proceed with it...
Thanks in advance <3
Nov 28, 2012 at 10:56pm UTC
Any help appreciated, please. I meant: how can I delete the duplicates from the result array?
I can print them using a loop:
for (std::size_t i = 0; i < res.size(); ++i)
{
    std::cout << res[i] << std::endl;
}
What would the code look like to delete the duplicates before printing? I'm very much a noob and need help; everything I test doesn't work...
EDIT:
I thought it would work with this:
std::sort(res.begin(), res.end());
for (int i = 0; i < res.size() - 1; i++) {
    if (res[i] == res[i + 1]) {
        res.erase(res.at(i));
        i--;
    }
}
but it tells me res has no member erase, and no at either :(
Last edited on Nov 28, 2012 at 11:08pm UTC
Nov 28, 2012 at 11:14pm UTC
It's unclear why you'd want to remove some of the submatches from a match set, or what kind of regular expression would need that. Perhaps you really need a regex iterator?
Describe the problem more fully: what is the input? What is the regular expression?
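If this is the usual "find every e-mail address in a file" job, a regex iterator visits each match in turn, and a std::set keeps one copy of each. A rough, untested sketch follows; the input string and e-mail pattern are made up, and on a C++11 compiler you would include <regex> and drop the tr1::.

#include <iostream>
#include <set>
#include <string>
#include <tr1/regex>

int main()
{
    // Made-up stand-ins for your file contents and pattern.
    std::string buf = "a@x.com b@y.com a@x.com";
    std::tr1::regex rx("[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+");

    std::set<std::string> seen; // a set stores each address only once

    // sregex_iterator walks every match in buf, not just the first one.
    std::tr1::sregex_iterator it(buf.begin(), buf.end(), rx), end;
    for (; it != end; ++it)
        seen.insert(it->str()); // str() is the full text of this match

    for (std::set<std::string>::const_iterator s = seen.begin(); s != seen.end(); ++s)
        std::cout << *s << '\n';
}

Note the difference from your code: match_results holds a single match (element 0) plus its capture groups, which is why it has no erase or at; the iterator is what gives you all the matches in the text.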
Nov 28, 2012 at 11:46pm UTC
First, sorry for being a noob; this should be easy but it's giving me trouble.
It's a simple regex for the e-mail format. It works and I can print the results, but it shows duplicates... I don't understand you well... If I change this
std::tr1::match_results<std::string::const_iterator> res;
it won't work.
I tried with this:
std::unique_copy(res.begin(), res.end(), std::ostream_iterator<std::string>(std::cout, "\n"));
It prints the results, but still with the duplicates. What am I doing wrong?
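From what I've read, std::unique_copy only skips duplicates that sit next to each other, so maybe I have to copy everything into a vector and sort it first? Something like this (not sure it's right; needs <vector>, <algorithm>, <iterator>):

std::vector<std::string> v(res.begin(), res.end()); // sub_matches convert to std::string
std::sort(v.begin(), v.end());                      // puts equal strings next to each other
std::unique_copy(v.begin(), v.end(),
                 std::ostream_iterator<std::string>(std::cout, "\n"));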
Last edited on Nov 28, 2012 at 11:54pm UTC
Nov 29, 2012 at 5:32am UTC
Somebody give me a hint please, I don't know... I'm still reading to try to figure it out, but no luck yet.
Forgot to say buf is a string:
std::string buf;
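In case it matters, I fill buf from the file roughly like this (file name made up; needs <fstream> and <iterator>):

std::ifstream in("input.txt");
std::string buf((std::istreambuf_iterator<char>(in)),
                std::istreambuf_iterator<char>());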
Topic archived. No new replies allowed.