optimize large nested for loops

I have a set of very large nested for loops, which takes too much time to complete.
ofstream file{"archive1.bin"};
text_oarchive oa{file};

struct pix { int x; int y; };
unordered_map<string,string> obj;
vector<pix> obj_pixel;
for (auto p0 : obj_pixel) {
   for (auto p1 : obj_pixel) {
      auto pv = /* ... some calculation .. */;
      for (auto p2 : obj_pixel) {
         obj[/* function of p0,p1,p2,pv */] = /* function of p0,p1,p2 */;
      }
   }
}
oa << obj;

Here the size of obj_pixel is in the range of 0 to 30,000, and it represents the pixels belonging to the same object, e.g. an apple or a pencil. I have to calculate the key and value for the unordered map obj in each loop iteration and save it all at the end. Is there any C++ library to optimize this kind of task? What is the best method to do so?
> which is too much time to complete.
How much time is "too much"? Hours, days, months?

> Here the size of obj_pixel is in range of 0 to 30,000 which represents the pixels belonging to a same object
Sure, and 30,000³ for your three nested loops is 2.7×10¹³.
Even if you manage 10⁹ iterations per second (just a few instructions on your average desktop), that's 27,000 seconds (~8 hours).

You could make sure all your loops take elements by reference, e.g. for (auto&& p0 : obj_pixel), and that all your "function of" calls take (const) references as well, so nothing gets copied on every iteration.
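A minimal sketch of what that looks like (sum_x is a made-up example function, not part of the poster's code):

```cpp
#include <vector>

struct pix { int x; int y; };

// Iterating by const reference avoids copying each element.
// With a tiny struct like pix it hardly matters, but for larger
// element types (or strings built inside the loop) it adds up.
long long sum_x(const std::vector<pix>& obj_pixel) {
    long long s = 0;
    for (const auto& p : obj_pixel)
        s += p.x;
    return s;
}
```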

But TBH, you need a better algorithm, not tweaks to a brute-force try everything.
Maybe if you explained the problem, not how to make your solution work, we could offer suggestions.
http://xyproblem.info/

> obj[function of p0,p1,p2,pv] = function of p0,p1,p2;
So not just a few instructions then.
Probably more like a few 1000.
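One cheap tweak on top of that: if you can estimate the final number of keys, reserving the hash table up front avoids repeated rehashes while the loops insert. A minimal sketch (build_map and the key/value scheme here are invented for illustration):

```cpp
#include <string>
#include <unordered_map>

// Pre-sizing the table means one allocation up front instead of
// several rehashes as it grows past each load-factor threshold.
std::unordered_map<std::string, std::string> build_map(int n) {
    std::unordered_map<std::string, std::string> obj;
    obj.reserve(n);  // n is just an estimate of the final key count
    for (int i = 0; i < n; ++i)
        obj[std::to_string(i)] = std::to_string(2 * i);
    return obj;
}
```

This doesn't change the O(n³) iteration count, of course; it only trims constant factors from the insertions.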

> oa << obj;
How big is this file when you've created it?