I am measuring the running time of foo(). If I plot the average time against n, I should get a parabola.
But instead I get the result below: the average grows, but at 7000 it drops. Why?
Argument   Average time (s)   Standard deviation (s)
   1000          2.3                 0.458258
   2000         17.4                 9.4784
   3000         40                  14.2548
   4000         74.9                14.6045
   5000         95.5                18.7949
   6000        197                  29.213
   7000        153.3                35.1
   8000        173.3                 6.35689
   9000        225.8                20.6824
  10000        518.1               348.98
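A rough sketch of the kind of timing harness in question (the body of foo(), clock() as the timer, and ten runs per argument are assumptions for this sketch, not the original code):

    #include <cstdio>
    #include <cmath>
    #include <ctime>

    // Placeholder for the function under test; the real foo() is not shown in this thread.
    void foo(int n) {
        volatile long long sink = 0;              // volatile so the loops are not optimized away
        for (int i = 0; i < n; ++i)
            for (int j = 0; j < n; ++j)
                sink += (long long)i * j;         // O(n^2) work, hence the expected parabola
    }

    int main() {
        const int runs = 10;                      // assumed number of samples per argument
        for (int n = 1000; n <= 10000; n += 1000) {
            double sum = 0.0, sumSq = 0.0;
            for (int r = 0; r < runs; ++r) {
                std::clock_t start = std::clock();
                foo(n);
                double t = (double)(std::clock() - start) / CLOCKS_PER_SEC;
                sum += t;
                sumSq += t * t;
            }
            double mean = sum / runs;
            double var  = sumSq / runs - mean * mean;   // population variance of the samples
            printf("Argument %d: average %g s, standard deviation %g s\n",
                   n, mean, std::sqrt(var > 0 ? var : 0.0));
        }
        return 0;
    }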
No, it does not "fall at 7000". From the numbers alone, the 6000 value is intuitively the outlier.
Did you plot the results? Did you fit a curve to the plot? Which point(s) deviate most?
What if you run the program again?
Ask yourself: Can something else running on the same machine at the same time affect the results?
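For instance, a rough sketch of the curve fit suggested above, using the averages posted in the question (the normal-equations approach and the scaling of n to thousands are choices made for this sketch, not anything from the original code):

    #include <cstdio>

    int main() {
        // Averages posted above, in seconds; the argument is scaled to thousands
        // (x = n / 1000) so the normal equations stay well conditioned.
        const double x[] = {1, 2, 3, 4, 5, 6, 7, 8, 9, 10};
        const double t[] = {2.3, 17.4, 40, 74.9, 95.5, 197, 153.3, 173.3, 225.8, 518.1};
        const int m = 10;

        // Accumulate the 3x3 normal equations for t ~ a*x^2 + b*x + c
        // (the fourth column holds the right-hand side).
        double A[3][4] = {};
        for (int i = 0; i < m; ++i) {
            double v[3] = {x[i] * x[i], x[i], 1.0};
            for (int r = 0; r < 3; ++r) {
                for (int c = 0; c < 3; ++c) A[r][c] += v[r] * v[c];
                A[r][3] += v[r] * t[i];
            }
        }

        // Solve by Gaussian elimination and back substitution.
        for (int r = 0; r < 3; ++r)
            for (int k = r + 1; k < 3; ++k) {
                double f = A[k][r] / A[r][r];
                for (int c = r; c < 4; ++c) A[k][c] -= f * A[r][c];
            }
        double coef[3];
        for (int r = 2; r >= 0; --r) {
            coef[r] = A[r][3];
            for (int c = r + 1; c < 3; ++c) coef[r] -= A[r][c] * coef[c];
            coef[r] /= A[r][r];
        }

        // The points with the largest residuals deviate most from the parabola.
        printf("fit: t = %.3f*x^2 + %.3f*x + %.3f  (x = n/1000)\n", coef[0], coef[1], coef[2]);
        for (int i = 0; i < m; ++i) {
            double pred = coef[0] * x[i] * x[i] + coef[1] * x[i] + coef[2];
            int n = (int)(x[i] * 1000);
            printf("n = %5d  measured = %6.1f  fitted = %6.1f  residual = %+7.1f\n",
                   n, t[i], pred, t[i] - pred);
        }
        return 0;
    }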
It's probably overflowing at large values. So maybe change all your 'int's to 'long long's. The common theme in all of these threads is that I have no idea what's going on. Nobody knows what specifically is wrong, because it's not clear what is supposed to be happening.
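We don't know what foo() actually computes, but that bug usually looks like this sketch (the expression is made up): the arithmetic is carried out in int before the result ever reaches a wider type, so widening only the destination doesn't help.

    #include <cstdio>

    int main() {
        int n = 7000;

        // Bug: n * n * n is evaluated entirely in int arithmetic, and 343,000,000,000
        // does not fit in 32 bits. The overflow happens before the assignment, so
        // storing the result in a long long does not rescue it (and signed overflow
        // is undefined behaviour; in practice you just see a garbage value).
        long long bad = n * n * n;

        // Fix: widen one operand first, so the whole expression is evaluated in long long.
        long long good = (long long)n * n * n;

        printf("bad  = %lld\n", bad);
        printf("good = %lld\n", good);
        return 0;
    }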
> Ask yourself: Can something else running on the same machine at the same time affect the results?
clock() (which was used in the first code snippet) is supposed to give you CPU time, so other processes or I/O operations should have little effect on it
unless you are on Windows, where they surprisingly managed to fuck it up and return wall-clock time
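If you want to see that difference on your own machine, a small sketch like this (assuming a C++11 compiler; the busy loop and the sleep are only for illustration) prints CPU time and wall-clock time side by side:

    #include <chrono>
    #include <cstdio>
    #include <ctime>
    #include <thread>

    int main() {
        std::clock_t c0 = std::clock();
        auto w0 = std::chrono::steady_clock::now();

        // Burn some CPU, then sleep. CPU time should count only the busy part,
        // wall-clock time counts both; on Windows clock() behaves like wall clock,
        // so there the two numbers come out roughly the same.
        volatile double x = 0;
        for (long i = 0; i < 200000000L; ++i) x += 1.0;
        std::this_thread::sleep_for(std::chrono::seconds(2));

        double cpu  = double(std::clock() - c0) / CLOCKS_PER_SEC;
        double wall = std::chrono::duration<double>(
                          std::chrono::steady_clock::now() - w0).count();

        printf("clock() reports %.2f s, wall clock reports %.2f s\n", cpu, wall);
        return 0;
    }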
> No, it does not "fall at 7000". From the numbers alone, the 6000 value is intuitively the outlier.
>> Okay, but how can I fix it?
- increase the number of samples
- remove the outliers (alpha-trim; see the sketch after this list)
- provide a clean environment (leave your experiment alone, don't start your 3D rendering)
- understand the error and deal with it
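A sketch of the alpha-trim idea, with made-up timings loosely modeled on the 6000 row above: sort the samples, drop a fraction from each end, and average the rest.

    #include <algorithm>
    #include <cstdio>
    #include <vector>

    // Alpha-trimmed mean: sort the samples, drop a fraction alpha from each end,
    // and average what is left, so occasional outliers (another process waking up,
    // a burst of I/O, ...) do not drag the average around.
    // Takes the vector by value so the caller's data is left untouched.
    double trimmedMean(std::vector<double> v, double alpha) {
        std::sort(v.begin(), v.end());
        size_t drop = static_cast<size_t>(alpha * v.size());
        double sum = 0.0;
        size_t kept = 0;
        for (size_t i = drop; i + drop < v.size(); ++i) {
            sum += v[i];
            ++kept;
        }
        return kept ? sum / kept : 0.0;
    }

    int main() {
        // Made-up timings for one argument: nine ordinary runs plus one outlier.
        std::vector<double> times = {95, 97, 93, 96, 94, 98, 95, 96, 94, 197};

        double plain = 0.0;
        for (size_t i = 0; i < times.size(); ++i) plain += times[i];
        plain /= times.size();

        printf("plain mean       = %.1f s\n", plain);
        printf("10%% trimmed mean = %.1f s\n", trimmedMean(times, 0.10));
        return 0;
    }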
Fix what? If the results are affected by something that is not under your control, something outside your code, then what could you do?
Now I tried it in Code::Blocks and it works correctly. Before, I was working in Visual Studio.
I think this question is closed. Thank you very much)