Hey all,
Quick question if I may.
I'm building a game, which is helping me a great deal in learning C++, and I've run into some arithmetic in one of my functions that is baffling me. I'm not exactly sure what's going on, and I was hoping someone could help explain the confusion away.
Here's the particular function whose math is confusing me, or to be more precise, how the program is handling that math:
void marbleGame::updateGameTimer()
{
    cout << startGameTimer << " startGameTimer inside updateGameTimer Func." << endl;
    cout << timeGetTime() << " timeGetTime inside updateGameTimer Func." << endl;
    cout << lastGameTimer << " lastGameTimer inside updateGameTimer Func." << endl;

    currentGameTimer = timeGetTime() - lastGameTimer; // Milliseconds elapsed since the last completed frame.
    cout << currentGameTimer << " currentGameTimer inside updateGameTimer Func." << endl;

    if (currentGameTimer < FPS_SPEED)
    {
        return; // Bail out until at least FPS_SPEED milliseconds have elapsed.
    }

    gameFrameCount++;
    lastGameTimer = timeGetTime();
}
To give you a quick summary: I'm currently testing the speed of my computer using the platform-dependent windows.h header, the timeGetTime() function, and my user-defined function updateGameTimer(). I placed the cout statements in updateGameTimer() just to check the outputs for debugging purposes.
There is an infinite game loop that quits when the player enters the 'q' character (for "Quit"), and inside that loop is another loop that runs updateGameTimer() so I can slow the game down to about 30 fps. That frame rate is defined in FPS_SPEED, and all of the variables involved, except for gameFrameCount of course, are declared as doubles inside the marbleGame class.
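In case the structure matters, here's roughly what that loop looks like. This is a simplified sketch, not my exact code; runGame() is just a placeholder name, and I've shown FPS_SPEED as a plain const double here (in the real program it's 33.33, i.e. milliseconds per frame for roughly 30 fps):

const double FPS_SPEED = 33.33;                // ms per frame, roughly 30 fps

void marbleGame::runGame()                     // placeholder name for my main loop function
{
    char input = ' ';
    while (input != 'q')                       // outer game loop, quits on 'q'
    {
        currentGameTimer = 0;
        while (currentGameTimer < FPS_SPEED)   // inner loop: spin until a frame's worth of time has passed
        {
            updateGameTimer();
        }
        // ... update positions, draw the frame, read 'input' here ...
    }
}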
At the beginning of the program I have the following variables initialized to zero and the return value of timeGetTime() assigned to startGameTimer. I've included it here to make it easier for you to follow what I'm doing.
gameFrameCount = 0;
startGameTimer = timeGetTime();
lastGameTimer = 0;
currentGameTimer = 0;
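And for completeness, this is roughly how those members are declared in the marbleGame class. Again just a sketch; the real class has more in it, and I've shown gameFrameCount as an int, although its exact type isn't what I'm asking about:

class marbleGame
{
public:
    void updateGameTimer();
    // ... rest of the interface ...
private:
    double startGameTimer;      // timeGetTime() value when the game starts
    double lastGameTimer;       // timeGetTime() value at the last completed frame
    double currentGameTimer;    // milliseconds elapsed since lastGameTimer
    int    gameFrameCount;      // frames completed so far
};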
Now, my confusion comes when I debug and step through the program. Everything works as it should but I'm not sure how the math is being handled on the program side. I'm sure it's something simple I'm forgetting.
When I performed a debugging test, these were the last values returned to the console:
2.94779e+008 startGameTimer inside updateGameTimer Func. // initialized at beginning.
294782630 timeGetTime inside updateGameTimer Func.
0 lastGameTimer inside updateGameTimer Func. // changes at end of loop.
2.97484e+008 currentGameTimer inside updateGameTimer Func.
What's throwing me off is the currentGameTimer value that comes out of the currentGameTimer = timeGetTime() - lastGameTimer; expression. In the debugger, the value is shown as:
currentGameTimer 294784153.00000000 double
and it's that number that is evaluated against (currentGameTimer < FPS_SPEED), which of course evaluates to false and skips the inner while loop, continuing on. However, the value that is printed to the console window is:
2.97484e+008
which doesn't look greater than 33.33 (the value FPS_SPEED is defined as).
Could someone please explain this? Why is the program using 294,784,153.00 (two hundred ninety-four million, which is what I assume it is, since those digits are to the left of the decimal point) to evaluate against, but printing 2.97484, which is only a 2 with decimal values following it?
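In case it matters, I also tried a tiny standalone test outside the game, just printing and comparing a large double, and I get the same style of output, so it doesn't seem to be anything specific to my timer code. The value here is just made up to be about the same size as my timer numbers:

#include <iostream>
using namespace std;

int main()
{
    double bigValue = 294784153.0;      // made-up value, roughly the size of my timer numbers

    cout << bigValue << endl;           // prints 2.94784e+008 for me, same style as in the game

    if (bigValue < 33.33)
        cout << "less than 33.33" << endl;
    else
        cout << "NOT less than 33.33" << endl;   // this is the branch that runs, just like in the game

    return 0;
}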