How many nanoseconds, or microseconds or milliseconds does it take for each iteration?
It depends on the compiler, the compiler's settings, the hardware on the user's computer, how much scheduling time the program gets from the OS, and a dozen other factors. It does not translate cleanly to a length of time.
In this case... a smart compiler will optimize out that entire loop because it does nothing, and will merely do i = 1000;, which compiles down to a single mov instruction (about as fast as it can get).
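For instance, with a do-nothing loop like the one below (an illustrative snippet, not code from the original question), gcc at -O2 typically emits no loop at all:

#include <stdio.h>

int main(void)
{
    int i;

    /* A loop with no observable side effects: an optimizing compiler
       (e.g. gcc -O2) is free to replace it with the equivalent of
       i = 1000; -- a single mov. */
    for (i = 0; i < 1000; i++)
    {
        /* do nothing */
    }

    /* Reading i afterwards is the only reason even that mov survives;
       if i were never used again, the whole thing could disappear. */
    printf("%d\n", i);
    return 0;
}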
And why do some use it as a timer mechanism?
They don't.
What they're actually doing is looping something a thousand or several thousand times and taking the average (mean) run time. Since each run can be of varying length (due to the above-mentioned factors)... taking the average gives you a better idea.
Plus it weeds out issues with timer resolution. If your timer can't time anything faster than 1 millisecond... then timing something that runs for 20 nanoseconds will appear to take as long as something that runs for 500 nanoseconds. So running it several thousand times brings the total up to something that's easier to measure.
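A rough sketch of that idea (illustrative only; the function and variable names here are made up, and the work being timed is just a placeholder):

#include <stdio.h>
#include <sys/time.h>

static volatile long sink = 0;

/* Placeholder for whatever you actually want to measure; the volatile
   counter keeps the compiler from optimizing the work away. */
static void operation_under_test(void)
{
    sink++;
}

int main(void)
{
    const long runs = 100000;            /* run it many times */
    struct timeval begin, end;
    long i;

    gettimeofday(&begin, NULL);
    for (i = 0; i < runs; i++)
        operation_under_test();
    gettimeofday(&end, NULL);

    long total_us = (end.tv_sec - begin.tv_sec) * 1000000L
                  + (end.tv_usec - begin.tv_usec);

    /* The average smooths out run-to-run jitter and lifts the total
       well above the timer's resolution. */
    printf("total: %ld us, average: %f us per run\n",
           total_us, (double)total_us / runs);
    return 0;
}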
Based on what I see there, I think I am talking to a similar device as they are. I notice they use gettimeofday to count elapsed microseconds until the difference reaches a maximum of var, in their example. Well, I was wondering why they were using gettimeofday, and since I am on Linux too, I created this test program:
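A minimal sketch of such a test program, assuming gettimeofday is polled in a loop until roughly 80 microseconds have elapsed; the variable names and the cutoff value here are placeholders, not the original code:

#include <stdio.h>
#include <sys/time.h>

int main(void)
{
    struct timeval actual_begin, actual_end, start, now;
    long elapsed_us = 0;
    int iterations = 0;
    const long max_us = 80;   /* assumed cutoff, standing in for "var" */

    gettimeofday(&actual_begin, NULL);   /* actual begin */
    gettimeofday(&start, NULL);

    /* Poll gettimeofday until roughly max_us microseconds have passed. */
    while (elapsed_us < max_us)
    {
        gettimeofday(&now, NULL);
        elapsed_us = (now.tv_sec - start.tv_sec) * 1000000L
                   + (now.tv_usec - start.tv_usec);
        iterations++;
    }

    gettimeofday(&actual_end, NULL);     /* actual end */

    printf("actual begin: %ld\n", (long)actual_begin.tv_usec);
    printf("actual end:   %ld\n", (long)actual_end.tv_usec);
    printf("iterations:   %d\n", iterations);
    return 0;
}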
And I noticed the elapsed time between actual begin and actual end is 735642 - 735558 = 84. That is 84 microseconds. It is interesting how the loop only goes 80 times and the length of time it took is 84 microseconds. It almost seems as if each iteration of the loop takes around 1 microsecond. I am on Ubuntu 12.04 (32-bit), Intel quad-core CPU, 3 GB of RAM. So I was thinking they probably use gettimeofday here because it returns microseconds, and they can use that to "wait" until around 80 microseconds have passed. At least that seems to be their intention.