zapshe wrote:
> Hence why Big-O is a measure of asymptotic time complexity and not actual time or algorithm logistics.
So why are you so focused on counting loops then?
Don't misunderstand me; I'm terrible at describing this. I don't mean you can use it to know how much time the code will take to run, but it does say something about how the running time changes as the input size grows towards infinity.
zapshe wrote:
> It isn't unnoticed how artificial and bad of an example you have to make to even try to show something that can prove your point.
Maybe you don't understand my point. Or maybe I don't care that it's "artificial" because it shows my point.
I just tried to demonstrate that those constants are not reliable for making useful judgements about which algorithm is faster (since they're only part of the whole picture). I'm talking in general. If you know the details of the algorithms, and you know they do a similar amount of work on each iteration, then it could of course make sense to look at the number of loop iterations, but you don't need Big O for that.
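To make that concrete, here's a made-up sketch (my own toy functions, nothing from your example): both loop exactly n times, so counting iterations can't tell them apart, yet the constant amount of work per iteration makes one far slower than the other.

```python
import time

def light_work(n):
    total = 0
    for i in range(n):
        total += i  # one cheap operation per iteration
    return total

def heavy_work(n):
    total = 0
    for i in range(n):
        # same number of iterations, but roughly 200x the work each time
        total += sum(j * j for j in range(200))
    return total

n = 10_000

start = time.perf_counter()
light_work(n)
t_light = time.perf_counter() - start

start = time.perf_counter()
heavy_work(n)
t_heavy = time.perf_counter() - start

print(t_heavy > t_light)  # same iteration count, very different times
```

Both are O(n) by iteration count; the difference is entirely in the constant that Big O throws away.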
zapshe wrote:
> there is a clear correlation between more looping and more time. Algorithms that loop less are more likely to not take as long as algorithms that loop more and accomplish the same goal. This is by no means a rule, but correlation.
Yes, but there is also a correlation between more code inside the loops and more time.
Peter87 wrote:
> If we don't care about the looping, just how much work it's doing
zapshe wrote:
> This is a nonsensical sentence. If you care how much work is being done, you care about the loops
I don't care that they happen to be loops specifically. It would just be an implementation detail.
zapshe wrote:
> Unless you have a very poorly made algorithm, an algorithmic step from two different algorithms will likely take a similar amount of time.
Do you have anything to back up this claim? I think the radix-sort example you posted earlier is quite a good counterexample, because the choice of radix can have a big impact on the amount of work that has to be done in each "step".
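As a back-of-the-envelope sketch with my own assumed cost model (not anything from your post): for an LSD radix sort on fixed-width keys, a bigger radix means fewer passes but more bucket work per pass, so the cost of a "step" really does depend on the radix you pick.

```python
import math

# Assumed model: sorting `bits`-bit keys with radix r takes
# ceil(bits / log2(r)) counting passes, and each pass touches
# all n keys plus r bucket counters.
def radix_cost(bits, radix, n):
    passes = math.ceil(bits / math.log2(radix))
    return passes, passes * (n + radix)

n = 1_000_000
for r in (2, 256, 65536):
    passes, work = radix_cost(32, r, n)
    print(f"radix {r}: {passes} passes, ~{work} units of work")
```

The asymptotic class doesn't change, but the work per pass (and the total) moves around quite a lot with the radix.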
zapshe wrote:
> Big-O, even by YOUR definition, would not be able to differentiate between algorithms in the way you're saying. One algorithm may be terribly slow but loop O(n) while another is really faster but loops O(n^2).
No, but I think it's a relatively good hint that will be accurate most of the time if n is just big enough, "unless you have a very poorly made algorithm". But please, let's not argue over this particular point. My opinion has never been that Big O is this great "tool" that allows me to do awesome things. It's a tool that tells me some things, but not everything. There are other tools as well that might be better suited for the task.
zapshe wrote:
> Why do you care if you loop n or n^2 times? You don't know the algorithm. The number of loops is worthless.
Sometimes I feel like you're talking against yourself...
There are two goals floating around here, and it's important to keep them apart when answering what is important.
Do you want to know the Big O? In that case you can often do that by counting loops, etc. (but the exact number is unimportant).
Do you want to know how long it will take to run for some particular n? Then you should probably run it and measure (either by using a timer or a profiler).
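A minimal sketch of what I mean by "run and measure" (the `algorithm` function here is just a stand-in for whatever code you actually care about):

```python
import time

# Stand-in for the real code under test.
def algorithm(n):
    return sum(i * i for i in range(n))

# Time it for the particular n you care about, rather than
# reasoning about asymptotics.
n = 100_000
start = time.perf_counter()
algorithm(n)
elapsed = time.perf_counter() - start
print(f"n={n}: {elapsed:.6f} seconds")
```

For serious measurements you'd want repeated runs (e.g. with the `timeit` module) to smooth out noise, but this is the basic idea.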
zapshe wrote:
> My algorithms professor (my 2nd algorithms class) was old enough that he could die at any moment. But he was very knowledgeable, explained time complexities, and taught very well. There was never a point where he mentioned the specific workings of one algorithm vs another when talking about time complexity.
Sounds like he is much more knowledgeable than I am. You should probably trust him more than me.
zapshe wrote:
> The reason: it doesn't matter. You cannot have a reliable system of algorithm comparisons based on how many operations. However, there is a clear correlation between the number of loops and how long an algorithm will take - hence the birth of time complexity.
I don't remember exactly how I was taught, but I remember they counted operations at some point ...
I guess there are different angles to approach it from.
It doesn't matter because those constants don't mean anything inside the Big O notation. They might mean something outside the Big O notation, but not inside.
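To show what I mean by "inside the notation" (a toy sketch with made-up operation counts): a constant factor only scales the count, it doesn't change how the count grows, which is why 5n is still O(n) while n² is genuinely different.

```python
# Hypothetical operation counts for two imaginary algorithms.
def cost_5n(n):
    return 5 * n  # constant factor 5: dropped inside Big O, still O(n)

def cost_n2(n):
    return n * n  # a genuinely different growth rate: O(n^2)

for n in (10, 100, 1000):
    # 5n/n stays flat at 5 no matter how big n gets; n^2/n keeps growing
    print(n, cost_5n(n) / n, cost_n2(n) / n)
```

The constant 5 is real work outside the notation (it's 5x slower per element), but as far as Big O is concerned it's invisible.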
zapshe wrote:
> I don't know what there is left to say.
You're right, we should probably try to round off this discussion. It feels like I'm starting to repeat myself.