Finding time complexity of a nested loop

Hello everyone! The outer for loop will run n times, and the second for loop will run n^2 times in total if I am not wrong (because it runs 1 time, then 2 times, then 3 times, ... so it follows the n(n+1)/2 pattern). What I don't understand is the if statement and the last for loop. I think the last for loop will also run n^2 times because it depends on j. Please correct me if I am wrong. Thanks

for (int i = 0; i < n; i++)
{
    for (int j = i; j <= i * 2; j++)
    {
        if (j % i == 0)
        {
            for (int k = 0; k < j; k++)
                printf("*");
        }
    }
}
By n^2 and n, I mean the time complexity of the respective loop.
When i is 0, j % i is a modulo by zero (effectively dividing by 0), so your program is invalid.

But let's assume i starts at 1 instead of 0, since that doesn't affect the big-O complexity of the problem anyway.

Your sequence closely resembles the following OEIS sequence:
A045943 Triangular matchstick numbers: a(n) = 3*n*(n+1)/2.
http://oeis.org/A045943

If you didn't have that if-statement there, then you'd essentially have 3 linear nested loops, giving you O(n^3).

The if-statement is the key here. The innermost loop would add another factor of n to your complexity, but that last loop only runs if j is divisible by i, i.e. "i divides j".

So, when is j divisible by i? I'm sure this can be shown rigorously with a bit of modular arithmetic, but if we just think about it:
• i is some number between 1 and n-1.
• For that i, j goes from i to 2i.

The only time j % i will be 0 is when j == i (3 % 3 == 0) or when j == 2 * i (6 % 3 == 0).
[If you start with j == i, then the next time j is divisible by i is j = i + i = 2i, which is exactly where the loop ends.]

Because the condition holds for only 2 values of j in each inner range (j == i and j == 2i), the work done inside the if is i + 2i = 3i per outer iteration, which grows no faster than the j loop around it. So the if-branch does not blow up the complexity: each outer iteration still does O(i) total work, and summing that over i = 1..n-1 gives O(n^2).

So the algorithm is O(n^2).
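To make that concrete, here is a minimal counting sketch (my own check, not the original program): it assumes i starts at 1, as above, and just counts the innermost printf-equivalent steps. The total works out to 3*(n-1)*n/2 stars, the triangular matchstick pattern from the OEIS link, so the ratio to n^2 settles around 1.5, i.e. quadratic growth:

#include <iostream>

int main()
{
    // Count the work done by the original triple loop, with i starting at 1
    // so that j % i is never a modulo by zero.
    for (long long n = 1000; n <= 8000; n *= 2)
    {
        long long ops = 0;                       // innermost "printf" steps
        for (long long i = 1; i < n; i++)
            for (long long j = i; j <= i * 2; j++)
                if (j % i == 0)                  // true only for j == i and j == 2*i
                    for (long long k = 0; k < j; k++)
                        ++ops;

        std::cout << "n = " << n << "  ops = " << ops
                  << "  ops / n^2 = " << static_cast<double>(ops) / (n * n) << '\n';
    }
}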
This is a past paper question! What if we do not start i at 0 and start it at 1 instead? Starting it at 0 is definitely going to give an error.
https://www.youtube.com/watch?v=9TlHvipP5yA
Fast forward to 4:50! He says that the sequence 1 time, 2 times, 3 times, ... follows n(n+1)/2, which gives us n^2 times.
So how would we comment on the time complexity then?
I'm not sure what your question is. But regardless of i starting at 0 or 1, the asymptotic growth is still the same. f(n) ~ n^2. I was just saying that in practical matters, n % 0 is undefined behavior and is likely to crash your program.

Yes, the loop shown at 4:50 is O(n^2), assuming that the "stmt;" part of the code represents some O(1) set of instructions.
Considering that video, the overall complexity of the loops below would be: n * (n^2) = n^3.
Is that right?
for (int i = 0; i < n; i++)
{
    for (int j = i; j <= i * 2; j++)
    {
        printf("*");
    }
}
j goes from i to i * 2. But i itself is growing linearly with respect to n, so the j loop, being dependent on the range of i, is also growing linearly. Combining that (linear*linear), you get O(n^2).

example values:
n = 4, i = n/4 --> j goes from 1 to 2 (range = 1)
n = 4, i = n/2 --> j goes from 2 to 4 (range = 2)
n = 4, i = n   --> j goes from 4 to 8 (range = 4)

So as i doubles (n/4, n/2, n), the range that j needs to iterate over also doubles (1, 2, 4). Which means that as i increases linearly, the range that j needs to iterate over also increases linearly. So you essentially have two linear loops, and the overall complexity is O(n^2).

Sorry if I'm not clearly explaining it, as I tend to think of these things on a case-by-case basis. But in general, you need to look at how j is growing based on what it's dependent on, and then you need to know how fast the thing that it's dependent on is growing.
It often helps if you plug in some values of n and count the number of times the inner-most loop iterates. It's almost cheating, but if you're confused it helps a lot to get an idea of how the algorithm's no. of instructions grows with respect to n.
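For example, here is a small sketch in that spirit (just my counter, not code from the question) for the loop above; the inner body runs exactly n*(n+1)/2 times, which is quadratic:

#include <iostream>

int main()
{
    // Count the inner iterations of
    //   for (i = 0; i < n; i++)
    //       for (j = i; j <= i * 2; j++) ...
    // and compare against n*(n+1)/2.
    for (long long n = 10; n <= 10000; n *= 10)
    {
        long long count = 0;
        for (long long i = 0; i < n; i++)
            for (long long j = i; j <= i * 2; j++)
                ++count;                         // stands in for the printf("*")

        std::cout << "n = " << n << "  count = " << count
                  << "  n*(n+1)/2 = " << n * (n + 1) / 2 << '\n';
    }
}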

_____________________

Edit: Here's an easier way of thinking about that loop.
j starts at i in your inner loop, but this is equivalent (in terms of complexity) to subtracting i from both the initial condition of j and the terminating condition of j.

So for (int j = i; j <= i * 2; j++) is essentially the same as
for (int j = i - i; j <= i * 2 - i; j++)
which is for (int j = 0; j <= i; j++)
Now, your loop algorithm is:
for (int i = 0; i < n; i++)
{
    for (int j = 0; j <= i; j++)
    {
        printf("*");
    }
}
and you already know the answer to that.
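If you want to convince yourself of that equivalence, here is a tiny side-by-side check (a sketch I added, not part of the original code): for every i, the original inner loop (j = i .. i*2) and the shifted one (j = 0 .. i) run the same number of times.

#include <iostream>

int main()
{
    // For each i, count the iterations of the original inner loop
    // (j = i .. i * 2) and of the shifted loop (j = 0 .. i); the counts match.
    for (int i = 0; i < 10; i++)
    {
        int original = 0, shifted = 0;
        for (int j = i; j <= i * 2; j++) ++original;
        for (int j = 0; j <= i; j++)     ++shifted;

        std::cout << "i = " << i << "  original = " << original
                  << "  shifted = " << shifted << '\n';
    }
}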
So I think I have found where I was making a mistake. I thought that Abdul Bari was talking about the inner loop only for O(n^2), but he meant the whole code (both loops). Both the video's code and the code given below behave the same way, so the time complexity of the code below will be O(n^2). Am I right?
#include <iostream>
#include <cstdlib>   // for system()
using namespace std;
int main()
{
	int n;
	cin >> n;
	for (int i = 0; i < n; i++)
	{
		cout << "Outer loop " << endl;
		for (int j = 0; j <= i; j++)
		{
			cout << "Inner loop " << endl << endl;
		}
	}
	system("pause");
}
Correct.
Thanks a lot! @Ganado
Sorry, but I am confused again after seeing the code below:
#include <iostream>
#include <cstdlib>   // for system()
using namespace std;
int main()
{
	int n;
	cin >> n;
	for (int i = 1; i <= n; i++)
	{
		cout << "outer " << endl;
		for (int j = 1; j <= i * i; j++)
		{
			cout << "Inner " << endl;
		}
	}
	system("pause");

}


If I analyze the above code just the way Abdul Bari did in the video (link is above), the number of times the inner loop executes is 1, 4, 9, 16, ... so n^2 times.
So if that is the case, then the overall complexity should be n^2, because Abdul Bari did the same thing: he analyzed how many times the inner loop executes and linked it to the series formula (n(n+1)/2 = O(n^2)). So the same approach should be followed here!
Consider:

#include <iostream>
#include <cmath>
using namespace std;

int main()
{
	int n;
	cin >> n;

	for (int z = 1; z <= n; ++z) {
		int cnt = 0;   // iterations of the innermost loop for this value of z

		for (int i = 1; i <= z; i++)
		{
			for (int j = 1; j <= i * i; j++)
			{
				++cnt;
			}
		}

		// print z, the total count, and the (integer) ratio cnt / z^2
		std::cout << z << "  " << cnt << "  " << cnt / (z * z) << std::endl;
	}
}


giving, for n = 20:


20
1  1  1
2  5  1
3  14  1
4  30  1
5  55  2
6  91  2
7  140  2
8  204  3
9  285  3
10  385  3
11  506  4
12  650  4
13  819  4
14  1015  5
15  1240  5
16  1496  5
17  1785  6
18  2109  6
19  2470  6
20  2870  7


There is a relation between n and the number of times the inner loop is executed. This relationship is k * (n^2) where k = (n - 2) / 3 + 1 (using int arithmetic, except for n = 1), i.e. roughly (n + 1) / 3.

So the count is about n^3 / 3 + n^2 / 3; exactly, it is 1 + 4 + 9 + ... + n^2 = n(n+1)(2n+1)/6. Either way, the overall complexity is O(n^3).
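As a quick check (a small sketch along the lines of the program above, not new analysis), comparing the measured count with that closed form:

#include <iostream>

int main()
{
    // Compare the measured count of inner iterations (1 + 4 + 9 + ... + n^2)
    // with the closed form n(n+1)(2n+1)/6, which grows like n^3 / 3.
    for (long long n = 1; n <= 20; ++n)
    {
        long long cnt = 0;
        for (long long i = 1; i <= n; i++)
            for (long long j = 1; j <= i * i; j++)
                ++cnt;

        long long closed = n * (n + 1) * (2 * n + 1) / 6;
        std::cout << n << "  " << cnt << "  " << closed << '\n';
    }
}

For n = 20 both columns give 2870, matching the table above.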

lost110,
then the overall complexity should be n^2

Let's make sure we're on the same page; when you say "overall complexity", I assume you are talking about the algorithm's runtime as a whole, from start to finish, with respect to n.

Your outer loop is linear. Hope we can all agree with that. Your inner loop goes from 1 to i * i, so it grows with the square of i, and i is in turn growing linearly with n.
So overall, the algorithm you have is n * (n * n), or O(n^3).

Yes, if you ignore the outer loop, the inner loop itself grows with the square of i (which is proportional to n), so if you're just talking about the inner loop, it's O(n^2).