Big O & time complexity

Salutations.

Can you help me understand how to calculate the Big O time complexity of an algorithm or a C++ program, please?
Think about it first, and show us your ideas.
Big O is an indication of the algorithm's behaviour. There isn't an exact calculation.
Neither of those links is particularly useful to someone wanting to take an algorithm and
determine its runtime efficiency. They're both great for mathematicians, though.

Big O is used to express the worst-case "runtime efficiency" of an algorithm. I put "runtime efficiency"
in quotes because there isn't a single definition of it. For example, I might want to put an upper bound
on the number of memory accesses a particular function performs, or on the amount of memory it uses.

In academia, you are usually asked to count the number of times an algorithm loops or the number
of comparisons a sorting algorithm performs. In any case, the input to the function is usually an
array (list, whatever) of N elements.

There is no magic to it. You simply have to count whatever it is you want to count -- memory accesses,
loops/iterations, memory used, whatever. Since the input to the function is an arbitrary number of
elements (I used N above), usually you can't arrive at an exact number, but you can come up with
a formula written in terms of N.

As an example, let's compute the number of inner-loop iterations (i.e., the number of times new is invoked) in the following algorithm:

void foo( int* array, size_t N ) {
    for( size_t i = 0; i < N; ++i )         // Outer loop
        for( size_t j = 0; j < i; ++j )     // Inner loop
            new int[ 10 ];                  // deliberately leaked; we only care about the count
}


Let's count the number of times new executes.

The first time we arrive at Inner loop, i == 0. j starts at zero, and j is not less than i, so the loop executes 0 times.

The second time we arrive at Inner loop, i == 1. j starts at zero, so the new executes. j is then incremented, and now j == 1, which is not less than i, so the new is not called again this time.

The third time we arrive at Inner loop, i == 2. You can easily figure out that new will be called twice this time.

The fourth time... you should be able to see the pattern.

1st time: 0 calls
2nd time: 1 call
3rd time: 2 calls
4th time: 3 calls
Nth time: N-1 calls

So the total number of times new is called is 0 + 1 + 2 + 3 + ... + (N-1).

You should know that the sum of the first k integers is k(k+1)/2; substituting N-1 for k, you get
(N-1)(N)/2 == ( N^2 - N ) / 2.

Now according to the rules of big-O notation, O(cN^2) == O(N^2) where c is a constant, so
we can immediately drop the 1/2 factor and work with N^2 - N. Again according to the rules
of big-O notation, if f(x) = g(x) + h(x), then O(f(x)) = O(g(x)) if g(x) grows faster than h(x),
or O(h(x)) if h(x) grows faster. In our case, N^2 grows faster than N, so our algorithm is O(N^2).
