Question about time complexity

zapshe wrote:
If you don't loop or have recursion, you're always going to have O(1). The specific number of operations you go through per n value doesn't matter.

You could call a function or use an operator that claims to be O(n) but you don't know anything about how it's implemented.

You could have some spaghetti code that jumps all over the place with goto and doesn't resemble a loop.

This is mostly theoretical. Thinking about it in terms of iterations and recursions is less abstract and often works in practice. But does that mean we should jump to the conclusion that all time complexities have been derived this way and then assume things about the constants that are not necessarily true?

zapshe wrote:
THIS seems to be a misinterpretation of Big O. For example:
for(int i = 0; i < size; i++)
{
    //1 line of work;
    //another line of work;
    //a 3rd line of work;
}

Assuming n=size, each line in the loop is executed n times.

If we count each line in the loop as a primitive operation, that means there are 3n primitive operations in total (ignoring loop overhead).

zapshe wrote:
This is NOT O(3n), this is O(n).

Technically it's both, because O(3n) and O(n) mean the same thing.
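
To be a bit more precise about why they mean the same thing: f(n) is O(g(n)) if there exist constants c and n0 such that f(n) <= c * g(n) for all n >= n0. With f(n) = 3n you can pick c = 3 (and n0 = 1) to show that 3n is O(n), and with c = 1 that n is O(3n), so the two notations describe exactly the same set of functions. Any constant factor just gets absorbed into c.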

Personally I don't see why O(3n) would come from the above code any less than the following code:
for(int i = 0; i < size; i++)
{
    //1 line of work;
}
for(int i = 0; i < size; i++)
{
    //another line of work;
}
for(int i = 0; i < size; i++)
{
    //a 3rd line of work;
}

After all, they do roughly the same amount of work, so why shouldn't they be treated the same? In practice they might not perform identically, but the Big O time complexity cannot tell us that, so it would be misleading to treat them differently.
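
Just to illustrate, here is a rough sketch where each "line of work" is replaced by a counter increment (a simplification on my part, of course):

#include <iostream>

int main()
{
    int size = 1000;
    long long one_loop = 0;
    long long three_loops = 0;

    // One loop doing three "lines of work" per iteration.
    for (int i = 0; i < size; i++)
    {
        ++one_loop; // 1 line of work
        ++one_loop; // another line of work
        ++one_loop; // a 3rd line of work
    }

    // Three loops doing one "line of work" each.
    for (int i = 0; i < size; i++) { ++three_loops; }
    for (int i = 0; i < size; i++) { ++three_loops; }
    for (int i = 0; i < size; i++) { ++three_loops; }

    std::cout << one_loop << " vs " << three_loops << '\n'; // prints "3000 vs 3000", i.e. 3n for both
}

Either way the total comes out to 3n units of work; the difference is only how the work is laid out.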

zapshe wrote:
You are NOT saying you're doing 3 lines of code per n value, you are saying you are going through ONE SET of instructions per n value.

I'm not doing any of that.

zapshe wrote:
for(int i = 0; i < size * 2; i++)
{
    //some work..
}

This would be O(2n), simplified to O(n).

Are you more OK with simplifying this than the code with multiple loops?

zapshe wrote:
Though you can clearly see it would do twice the amount of looping ...

If we don't care about the looping, just how much work it's doing, then you can double the amount of work by either doing twice as many iterations or doing twice as much per iteration.
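
In other words, counting only the work inside the loops: 2n iterations × 1 unit per iteration = n iterations × 2 units per iteration = 2n units in total.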

stackoverflow wrote:
O(n) means, in a cycle of n iterations, for every n there is only a single algorithmical step to be taken
zapshe wrote:
For every n, there's a single algorithmical step, not a single operation - but the combination of operations that make up your algorithm.

I'm not sure how accurate or comprehensive it is to express it like that, but by all means, group them. That is essentially what removing the constants is all about.

When you have two sections of code with the same time complexity, one after the other, you can always group them.

x = 8; // O(1)
++x; // O(1) 
Together they are still O(1)

for(int i = 0; i < n; i++) { ++x; } // O(n) 
for(int i = 0; i < n; i++) { ++y; } // O(n)  
Together they are still O(n)

EDIT: You can also group two sections of code with different time complexities but then you only keep the bigger (dominant) term.

x = 8; // O(1)
for(int i = 0; i < n; i++) { ++x; } // O(n)  
Together they are O(n)
Peter87 wrote:
You could call a function or use an operator that claims to be O(n) but you don't know anything about how it's implemented.

Hence why Big-O is a measure of asymptotic time complexity and not actual time or algorithm logistics.

However, there is a clear correlation between more looping and more time. Algorithms that loop less are likely to take less time than algorithms that loop more while accomplishing the same goal. This is by no means a rule, just a correlation.

Peter87 wrote:
Personally I don't see why O(3n) would come from the above code any less than the following code

An argument can easily be made that the 3 loops still constitute O(n) even by my standards due to the fact that all 3 loops run a single line from the same algorithm. The fact that you can combine them simply shows that each loop does 1/3 of the whole algorithm - hence O(n) anyway.

It hasn't gone unnoticed how artificial and contrived an example you have to construct to even try to prove your point.

Peter87 wrote:
If we don't care about the looping, just how much work it's doing

This is a nonsensical sentence. If you care how much work is being done, you care about the loops. Unless you have a very poorly made algorithm, an algorithmic step from two different algorithms will likely take a similar amount of time. But if one loops 2x more than the other, you've got a slower algorithm.

Big-O, even by YOUR definition, would not be able to differentiate between algorithms in the way you're saying. One algorithm may be terribly slow but loop only O(n) times, while another may actually be faster but loop O(n^2) times.

Why do you care if you loop n or n^2 times? You don't know the algorithm. The number of loops is worthless.


Peter87 wrote:
That is essentially what removing the constants is all about.

Somehow, you managed to not understand.



My algorithms professor (my 2nd algorithms class) was old enough that he could die at any moment. But he was very knowledgeable, explained time complexities, and taught very well. There was never a point where he mentioned the specific workings of one algorithm vs another when talking about time complexity.

The reason: it doesn't matter. You cannot have a reliable system of algorithm comparisons based on how many operations. However, there is a clear correlation between the number of loops and how long an algorithm will take - hence the birth of time complexity.


I don't know what there is left to say. Your notion of time complexity is wrong. And your arguments against what I'm saying apply to YOUR OWN idea of how to evaluate Big-O.
zapshe wrote:
Hence why Big-O is a measure of asymptotic time complexity and not actual time or algorithm logistics.

So why are you so focused on counting loops then?

Don't misunderstand me. I'm terrible at describing this. I don't mean you can use it to know how much time it will take to run, but it says something about how the time changes as the input size grows towards infinity.

zapshe wrote:
It isn't unnoticed how artificial and bad of an example you have to make to even try to show something that can prove your point.

Maybe you don't understand my point. Or maybe I don't care that it's "artificial" because it shows my point.

I just tried to demonstrate that those constants are not reliable for making useful judgements about which algorithm is faster (since they're only part of the whole picture). I'm talking in general. If you know the details about the algorithms and you know they do a similar amount of work on each iteration then it could of course make sense to look at the number of loop iterations, but you don't need to use Big O for that.

zapshe wrote:
there is a clear correlation between more looping and more time. Algorithms that loop less are likely to take less time than algorithms that loop more while accomplishing the same goal. This is by no means a rule, just a correlation.

Yes, but there is also a correlation between more code inside the loops and more time.

Peter87 wrote:
If we don't care about the looping, just how much work it's doing
zapshe wrote:
This is a nonsensical sentence. If you care how much work is being done, you care about the loops

I don't care that they happen to be loops specifically. It would just be an implementation detail.

zapshe wrote:
Unless you have a very poorly made algorithm, an algorithmic step from two different algorithms will likely take a similar amount of time.

Do you have anything to back up this claim? I think the radix-sort example you posted earlier is quite a good counterexample because the choice of radix can have quite a big impact on the amount of work that has to be done in each "step".

zapshe wrote:
Big-O, even by YOUR definition, would not be able to differentiate between algorithms in the way you're saying. One algorithm may be terribly slow but loop only O(n) times, while another may actually be faster but loop O(n^2) times.

No, but I think it's a relatively good hint that will be accurate most of the time if n is just big enough, "unless you have a very poorly made algorithm". But let's not argue over this particular point. My opinion has never been that Big O is some great "tool" that allows me to do awesome things. It's a tool that tells me some things, but not everything. There are other tools as well that might be better suited for the task.

zapshe wrote:
Why do you care if you loop n or n^2 times? You don't know the algorithm. The number of loops is worthless.

Sometimes I feel like you're talking against yourself...

There are two goals floating around here and it's important to keep them apart when answering what is important.

Do you want to know the Big O? In that case you can often do that by counting loops, etc. (but the exact number is unimportant).

Do you want to know how long it will take to run for some particular n? Then you should probably run and measure (either by using a timer or a profiler).
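
For the measuring part, a minimal sketch of what I mean (sum_all is just a made-up stand-in for whatever algorithm you actually want to time):

#include <chrono>
#include <iostream>
#include <numeric>
#include <vector>

// Made-up stand-in for the algorithm being measured.
long long sum_all(const std::vector<int>& v)
{
    return std::accumulate(v.begin(), v.end(), 0LL);
}

int main()
{
    std::vector<int> data(10000000, 1); // some particular n

    auto start = std::chrono::steady_clock::now();
    long long result = sum_all(data);
    auto stop = std::chrono::steady_clock::now();

    auto ms = std::chrono::duration_cast<std::chrono::milliseconds>(stop - start);
    std::cout << "result: " << result << ", took " << ms.count() << " ms\n";
}

A profiler (or repeating the measurement for several different n) tells you more, but this is the basic idea.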

zapshe wrote:
My algorithms professor (my 2nd algorithms class) was old enough that he could die at any moment. But he was very knowledgeable, explained time complexities, and taught very well. There was never a point where he mentioned the specific workings of one algorithm vs another when talking about time complexity.

Sounds like he is much more knowledgeable than I am. You should probably trust him more than me.

zapshe wrote:
The reason: it doesn't matter. You cannot have a reliable system of algorithm comparisons based on how many operations. However, there is a clear correlation between the number of loops and how long an algorithm will take - hence the birth of time complexity.

I don't remember exactly how I was taught, but I remember they counted operations at some point ...
I guess there are different angles to approach it from.
It doesn't matter because those constants don't mean anything inside the Big O notation. They might mean something outside the Big O notation, but not inside.
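
A made-up comparison of 1000n vs n^2 maybe shows what I mean by "outside" vs "inside" the notation: the constant only decides where the two curves cross, the growth rate decides who wins in the long run.

n = 100      :  1000n = 100,000        n^2 = 10,000
n = 1,000    :  1000n = 1,000,000      n^2 = 1,000,000
n = 100,000  :  1000n = 100,000,000    n^2 = 10,000,000,000

So for small n the constant matters a lot, but past the crossover point the O(n^2) algorithm keeps falling further behind no matter what the constants are.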

zapshe wrote:
I don't know what there is left to say.

You're right, we should probably try and round off this discussion. It feels like I'm starting to repeat myself.
Peter87 wrote:
I don't remember exactly how I was taught, but I remember they counted operations at some point ...

I found some slides from a "data structures and algorithms" course at the same school that I went to. I have probably not seen these exact slides before because I studied there a few years earlier, but I still think there are some interesting quotes that might explain where I'm coming from.

Page 1 wrote:
We will measure the amount of work the algorithm does in terms of “elementary operations”.
They are assumed to require one time unit.
Complexity is a relative concept, only interesting together with the corresponding elementary operation.
Page 4 wrote:
Strategy for complexity calculations, quick rep.
1) Set up the requirements, principally which elementary operations (EO) you use, i.e. what cost you are calculating (and possibly what you are NOT counting).
2) Set up the formula for the complexity rather exactly (mathematically correct) and motivate carefully what you do.
3) Solve the formula ...

Slides: http://www.cse.chalmers.se/edu/year/2018/course/TDA416_Datastrukturer/mtrl/lectures/F4.2.pdf
Peter87 wrote:
Sometimes I feel like you're talking against yourself...

It was sarcasm, hinting that if you don't care if the algorithm loops 1000*n times, why do you care if it loops n^2 times.

From your provided link:

We will measure the amount of work the algorithm does in terms of “elementary operations”. They are assumed to require one time unit. Complexity is a relative concept, only interesting together with the corresponding elementary operation.

The English in the slides isn't great, but this may have been making the same point I am. All elementary operations reduce to O(1).

From https://www.cs.utexas.edu/~mitra/csSpring2017/cs313/lectures/algo.html#:~:text=An%20elementary%20operation%20is%20one,and%20not%20the%20exact%20time.

We analyze an algorithm in terms of the number of elementary operations that are involved. An elementary operation is one whose execution time is bounded by a constant for a particular machine and programming language. Thus within a multiplicative constant it is the number of elementary operations executed that matters in the analysis and not the exact time.

Since the exact time for an elementary operation is unimportant, we say, that an elementary operation can be executed at unit cost.


In other words, you wouldn't be counting the number of operations at all. Any part of a loop or recursive call that is constant time is simply reduced to O(1) and discarded.

For example, would this be O(3n) or O(n + 3)?:

for(int i = 0; i < n; i++)
{
   //1 line of work;
   //another line of work;
   //3rd line of work;
}


If you'd call this O(3n), then what would this one be?:

for(int i = 0; i < n * 3; i++)
{
   //work..
}


Clearly, O(3n). However, these two examples can't both be O(3n). The work in the first example's loop reduces to O(1) since it's constant work per iteration.

But in the second example, the very number of loops is affected. The fact that it's affected by a constant is the only reason it's removed.

The first one, as I've understood Big-O, would be O(n). I'd be as inclined to get rid of any constant for that example as you are.

It will not be as useful as knowing O(100n) vs O(n), where you'd know that one algorithm loops 100 times per n value rather than just once.


We could have probably solved world hunger by now.
zapshe wrote:
We could have probably solved world hunger by now.

Good idea. I'll try that, starting with myself. See you around!
<3
Big-O Complexity Chart:

https://github.com/gibsjose/cpp-cheat-sheet/blob/master/General/Complexity%20Chart.png

There's a nice layout of the time complexities of common operations on various data structures:

https://github.com/gibsjose/cpp-cheat-sheet/blob/master/Data%20Structures%20and%20Algorithms.md