Strange millisecond timer results Oo

Hello everybody, I need to create a timer to test my network platform, and I would like millisecond precision.
So I wrote this:

// ********************************************************
// file timer.h
// ********************************************************
// Timer class: for performance testing
// ********************************************************
// On the STMicro platform, the time returned by GetTickCount
// includes a ~0.014% time drift, which is by design. This
// translates to a time lag of approximately 1 s every 2 h.
// The value wraps around to zero if the system runs
// continuously for 49.7 days. (MSDN source.)

#ifndef _TIMER_H_
#define _TIMER_H_

#define MAX_SAUV 100

#include <windows.h>

class Timer
{
public:
    Timer()
    {
        // Clear all saved start times.
        for (int i = 0; i < MAX_SAUV; i++)
            sauv[i] = 0;
    }

    // Current tick count in milliseconds.
    double T_Time(void)
    {
        return (double)GetTickCount();
    }

    // Store the start time in slot i.
    void T_Start(int i)
    {
        if (i < MAX_SAUV)
            sauv[i] = GetTickCount();
    }

    // Elapsed milliseconds since T_Start(i) on the same slot.
    double T_Stop(int i)
    {
        if (i < MAX_SAUV)
            return (double)(GetTickCount() - sauv[i]);
        return 0;
    }

private:
    DWORD sauv[MAX_SAUV];
};

#endif
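
A side note on the 49.7-day wraparound mentioned in the header comment: since sauv[i] and GetTickCount() both hold DWORD values, the subtraction in T_Stop() still gives the correct elapsed time even if the tick count wraps once between start and stop, because unsigned arithmetic is modular. A quick sketch of the idea:

// DWORD subtraction is modular (mod 2^32), so a single wraparound
// between start and stop does not break the elapsed-time computation.
DWORD start   = 0xFFFFFF00;   // shortly before the 49.7-day wrap
DWORD now     = 0x00000100;   // shortly after the wrap
DWORD elapsed = now - start;  // 0x200 = 512 ms, as expected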

/*************************************************
So I wrote this main() to test the class:

#include <iostream>
#include "timer.h"   // pulls in <windows.h> for Sleep()

int main()
{
    Timer t;
    double t1, t2, t3, t4;

    t.T_Start(1);
    Sleep(5);
    t1 = t.T_Stop(1);

    t.T_Start(2);
    Sleep(10);
    t2 = t.T_Stop(2);

    t.T_Start(3);
    Sleep(100);
    t3 = t.T_Stop(3);

    t.T_Start(4);
    Sleep(1000);
    t4 = t.T_Stop(4);

    std::cout << "Time: T1:" << t1 << "ms T2:" << t2 << "ms T3:" << t3
              << "ms T4:" << t4 << "ms" << std::endl;

    return 0;
}
/*********************************************
Now the strange result:
Time: T1: 0ms T2: 15ms T3: 94ms T4: 1000ms


Can somebody explain this to me? MSDN says the GetTickCount() function returns a millisecond value that wraps around every 49 days, so... Oo
I found the solution.
Sleep() and GetTickCount() are not very precise: both are limited by the system timer resolution (typically around 10-16 ms), which explains why the 5 ms and 10 ms sleeps came out as 0 ms and 15 ms. So I now use QueryPerformanceCounter().

http://c.developpez.com/faq/index.ph...E_chronometrer (it's in French, but code is code :))
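
For reference, here is a minimal sketch of the same Timer interface rebuilt around QueryPerformanceCounter() / QueryPerformanceFrequency(). The slot-based T_Start/T_Stop layout is kept from my original class; the HrTimer name and this exact code are just an illustration, not taken from that FAQ:

// ********************************************************
// file hr_timer.h -- high-resolution variant (sketch)
// ********************************************************
#ifndef _HR_TIMER_H_
#define _HR_TIMER_H_

#define MAX_SAUV 100

#include <windows.h>

class HrTimer
{
public:
    HrTimer()
    {
        // Counter ticks per second; fixed at system boot.
        QueryPerformanceFrequency(&freq);
        for (int i = 0; i < MAX_SAUV; i++)
            sauv[i].QuadPart = 0;
    }

    // Store the current counter value in slot i.
    void T_Start(int i)
    {
        if (i < MAX_SAUV)
            QueryPerformanceCounter(&sauv[i]);
    }

    // Elapsed milliseconds since T_Start(i), with sub-millisecond precision.
    double T_Stop(int i)
    {
        if (i >= MAX_SAUV)
            return 0;
        LARGE_INTEGER now;
        QueryPerformanceCounter(&now);
        return (double)(now.QuadPart - sauv[i].QuadPart) * 1000.0
               / (double)freq.QuadPart;
    }

private:
    LARGE_INTEGER freq;
    LARGE_INTEGER sauv[MAX_SAUV];
};

#endif

With this, the measurements are no longer quantized to ~15 ms steps. Keep in mind that Sleep() itself is still only accurate to the system timer granularity, so Sleep(5) can still block for around 15 ms of real time.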
Better to look at the MSDN sample code instead (KB, WSDK); that French site is well known to be quite bad.