Resolution of delay function

Hi folks,

In the function below, what is the resolution of the delay? Here's an example of the data from one run. So, elapsed time = (2451315171111 - 2451297499881) / 3579545 = 4.9367251983143... sec. Are there 7 significant figures in the answer, so that the resolution is 0.0000001 sec, i.e. 0.1 microseconds?

Thank you!

frequency = 3579545
currentTime = 2451315171111
startTime = 2451297499881

#include &lt;windows.h&gt;

void delay(double delay_s)
{
    LARGE_INTEGER frequency;
    LARGE_INTEGER startTime;
    LARGE_INTEGER currentTime;

    QueryPerformanceFrequency(&frequency);
    QueryPerformanceCounter(&startTime);

    do {
        QueryPerformanceCounter(&currentTime);
    } while (((currentTime.QuadPart - startTime.QuadPart) / frequency.QuadPart) < delay_s);
}
The resolution depends on the computer and on how Windows has decided to schedule the thread. More precisely, it is the time it takes to make one call to QueryPerformanceCounter(), plus the time it takes to evaluate the condition, plus one conditional jump. It's impossible to say with more precision. It could be 1 millisecond on one computer and 1 picosecond on another.
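If you want a feel for that limit on a particular machine, one rough way (this small test program is an illustration added here, not code from the thread) is to call QueryPerformanceCounter() back to back and look at the smallest nonzero difference between consecutive readings:

#include &lt;windows.h&gt;
#include &lt;stdio.h&gt;

// Rough estimate of the smallest observable step of the performance counter,
// i.e. roughly the cost of one QueryPerformanceCounter() call plus loop overhead.
int main()
{
    LARGE_INTEGER frequency, previous, current;
    QueryPerformanceFrequency(&frequency);
    QueryPerformanceCounter(&previous);

    LONGLONG smallest = -1;
    for (int i = 0; i < 1000000; ++i) {
        QueryPerformanceCounter(&current);
        LONGLONG delta = current.QuadPart - previous.QuadPart;
        if (delta > 0 && (smallest < 0 || delta < smallest))
            smallest = delta;
        previous = current;
    }

    printf("smallest nonzero step: %lld ticks = %.9f s\n",
           smallest, (double)smallest / (double)frequency.QuadPart);
    return 0;
}

The number this prints is the best case for any busy-wait built on the counter; the actual delay can still overshoot by much more if the thread gets preempted.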
Thank you, helios. I added some statements to print the startTime, currentTime, and elapsed time when the function finishes. It appears the routine is very accurate at delaying an integer number of seconds, but does not work when the delay is not an integer. For example, a delay of 2.4 rounds up to a delay of 3.

What is the best function for delaying a certain amount of time with a known accuracy? I wrote a function similar to the one above that uses clock(), but I read that QueryPerformanceCounter and QueryPerformanceFrequency were more accurate, so I changed it to the above. It looks like I should scrap this code and go back to using clock().
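(For reference, a clock()-based busy wait along the lines the poster describes, reconstructed here rather than taken from the thread, would look roughly like this; its granularity is at best 1/CLOCKS_PER_SEC, which is 1 ms with the Microsoft C runtime.)

#include &lt;time.h&gt;

// Busy-wait using clock(); resolution is at best 1/CLOCKS_PER_SEC
// (1 ms with the Microsoft CRT, where clock() measures wall-clock time).
void delay_clock(double delay_s)
{
    clock_t start = clock();
    while ((double)(clock() - start) / CLOCKS_PER_SEC < delay_s)
        ;  // spin
}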
Actually, I just noticed: you're losing a lot of precision when dividing by frequency. The QuadPart division is integer arithmetic, so the elapsed time is truncated to whole seconds before it is compared with delay_s, and I wouldn't expect that function to have a resolution better than 1 s. Try this condition instead:

(currentTime.QuadPart - startTime.QuadPart < delay_us.QuadPart * frequency.QuadPart / 1000000)

delay_us should be a LARGE_INTEGER representing microseconds; multiplying before dividing converts the requested delay to counter ticks without throwing away the fractional part.
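A minimal sketch of the whole routine reworked along those lines (an illustration added here, not code from the thread; the name delay_microseconds is made up) keeps everything in integer ticks:

#include &lt;windows.h&gt;

// Busy-wait for delay_us microseconds using the performance counter.
// All arithmetic stays in 64-bit integer ticks, so nothing truncates to whole seconds.
void delay_microseconds(LONGLONG delay_us)
{
    LARGE_INTEGER frequency, startTime, currentTime;

    QueryPerformanceFrequency(&frequency);
    QueryPerformanceCounter(&startTime);

    // Requested delay converted to counter ticks (multiply first to keep precision).
    LONGLONG delayTicks = delay_us * frequency.QuadPart / 1000000;

    do {
        QueryPerformanceCounter(&currentTime);
    } while (currentTime.QuadPart - startTime.QuadPart < delayTicks);
}

With the frequency from the original post, a 2.4 s delay becomes delay_microseconds(2400000), i.e. 2400000 * 3579545 / 1000000 = 8590908 ticks, so the 0.4 s is no longer rounded away.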

Sleep() has a resolution of 1 centisecond.
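One way to see what that centisecond figure means in practice (again a small test added here, not from the thread) is to time Sleep(1) with the same performance counter; on a default configuration the measured durations come out far larger than the 1 ms requested:

#include &lt;windows.h&gt;
#include &lt;stdio.h&gt;

// Time how long Sleep(1) actually takes; the result shows the scheduler-tick
// granularity behind Sleep()'s coarse resolution.
int main()
{
    LARGE_INTEGER frequency, before, after;
    QueryPerformanceFrequency(&frequency);

    for (int i = 0; i < 5; ++i) {
        QueryPerformanceCounter(&before);
        Sleep(1);                       // ask for 1 ms
        QueryPerformanceCounter(&after);
        printf("Sleep(1) took %.3f ms\n",
               (after.QuadPart - before.QuadPart) * 1000.0 / frequency.QuadPart);
    }
    return 0;
}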