I use this to check the time since the last update:
// time since last update
float t = ((float) clock() - lastUpdate) / CLOCKS_PER_SEC;
t *= 1000; // ms
printf("time since last: %f\n", t);
lastUpdate = clock();
But my values have strange jumps:
time since last: 21.740000
time since last: 0.538000
time since last: 0.268000
time since last: 0.674000
time since last: 8.456000
time since last: 15.274000
time since last: 0.584000
time since last: 0.548000
time since last: 21.036001
time since last: 0.556000
time since last: 0.354000
time since last: 0.748000
time since last: 8.172000
time since last: 15.240000
time since last: 0.550000
time since last: 0.352000
Is it my program or is this not a good way to measure time?
Also, now that I think about it: my program runs at 60 frames per second max, but often less, around 50 frames. 1000 ms / 50 = 20 ms per frame, so a value close to zero seems extremely low.
On the other hand, it only updates certain parts when there is input from the webcam, and when there is, it has to do a lot of calculation. So this could be normal, right?
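
In case it helps for comparison, this is roughly the wall-clock version of the measurement I could try instead (a minimal sketch, assuming POSIX clock_gettime with CLOCK_MONOTONIC is available; the usleep is just a stand-in for a frame of work):

#include <stdio.h>
#include <time.h>
#include <unistd.h>

int main(void)
{
    struct timespec lastUpdate, now;
    clock_gettime(CLOCK_MONOTONIC, &lastUpdate);

    for (int i = 0; i < 5; i++) {
        usleep(20000); /* stand-in for roughly 20 ms of frame work */

        /* wall-clock time since the previous iteration, in ms */
        clock_gettime(CLOCK_MONOTONIC, &now);
        double ms = (now.tv_sec  - lastUpdate.tv_sec)  * 1000.0
                  + (now.tv_nsec - lastUpdate.tv_nsec) / 1.0e6;
        printf("wall time since last: %f\n", ms);
        lastUpdate = now;
    }
    return 0;
}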