Hello,
I am not sure if this is the place to ask.
I am developing an algorithm in C++ on Windows XP with Visual Studio 2008.
The algorithm is proven to be linear ( O(n) ).
All of the memory the algorithm uses is allocated dynamically before it starts.
I measured the time it takes to complete the computation and observed the following:
1. Using 4 MB of memory, the run takes time X.
2. Using 40 MB of memory, only 4 MB of it is actually touched, but the touches are scattered across the whole 40 MB
( i.e. any part of the allocated 40 MB may be accessed, but only 4 MB in total is guaranteed to be used ),
and the run takes time 1.3X.
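To make the setup concrete, here is a minimal sketch of the kind of measurement I mean. It is not my actual algorithm; the buffer sizes, the stride, and the helper names touchMemory/timeIt are illustrative only:

// Touch the same total number of bytes, either packed into a small
// buffer or scattered across a larger one, and compare the times.
#include <windows.h>
#include <cstdio>
#include <cstdlib>

// Touch `touchBytes` bytes inside a buffer of `bufBytes`, stepping by
// `stride` so the touches spread across the whole buffer.
static void touchMemory(volatile char* buf, size_t bufBytes,
                        size_t touchBytes, size_t stride)
{
    size_t touched = 0, i = 0;
    while (touched < touchBytes) {
        buf[i] += 1;                    // one byte read-modify-write
        i = (i + stride) % bufBytes;
        ++touched;
    }
}

// Wall-clock time of one touchMemory() run, in seconds. Note that the
// timing includes the first-touch (demand-zero) page faults.
static double timeIt(volatile char* buf, size_t bufBytes,
                     size_t touchBytes, size_t stride)
{
    LARGE_INTEGER freq, t0, t1;
    QueryPerformanceFrequency(&freq);
    QueryPerformanceCounter(&t0);
    touchMemory(buf, bufBytes, touchBytes, stride);
    QueryPerformanceCounter(&t1);
    return double(t1.QuadPart - t0.QuadPart) / double(freq.QuadPart);
}

int main()
{
    const size_t MB = 1024 * 1024;
    const size_t touchBytes = 4 * MB;

    // Case 1: 4 MB buffer, touched contiguously.
    char* small = (char*)malloc(4 * MB);
    // Case 2: 40 MB buffer, the same 4 MB of touches scattered over it.
    char* big   = (char*)malloc(40 * MB);
    if (!small || !big)
        return 1;

    printf("4 MB packed:   %.3f s\n",
           timeIt(small, 4 * MB, touchBytes, 1));
    printf("4 MB in 40 MB: %.3f s\n",
           timeIt(big, 40 * MB, touchBytes, 4093)); // odd prime stride scatters the touches

    free(small);
    free(big);
    return 0;
}

In both cases the same number of bytes is touched; only the buffer size and the access pattern differ.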
Does anyone have an explanation for this, or references to read?
Maybe it is the number of page faults involved?
Thanks,
Omri