I’ve been looking into a simple thing, and it’s proven to me that while it is a simple thing, it’s pretty complicated to get your head around. Does that make it not simple? It’s a game loop, because I’m interested in game design at the moment and game engines are pretty interesting.
It’s the concept of performing one routine at a fixed frequency, like 20 times a second, and in the time between, performing another routine as many times as possible. Basically, it’s a game loop:
/* Assumed context not shown in the original: ticks() returns elapsed
   milliseconds, TICK_TIME is the milliseconds per logic update
   (50ms = 20 updates/sec), and MAX_LOOPS caps catch-up updates. */
#define TICK_TIME 50
#define MAX_LOOPS 10  /* assumed value; limits catch-up per frame */

int main(int argc, char **argv)
{
    gboolean bGameDone = FALSE;
    gboolean bNetworkGame = FALSE;
    gboolean bCanRender = TRUE;
    long tickCountAtLastCall, newTime, frameTicks;
    int numLoops;

    tickCountAtLastCall = ticks();
    while (!bGameDone) {
        newTime = ticks();
        frameTicks = 0;
        numLoops = 0;
        long ticksSince = newTime - tickCountAtLastCall;

        /* Run the fixed-rate logic until it catches up with real time,
           but never more than MAX_LOOPS times in one pass. */
        while (ticksSince > TICK_TIME && numLoops < MAX_LOOPS) {
            GameTickRun(); // logic/update
            // tickCountAtLastCall has now advanced by one tick; update it
            tickCountAtLastCall += TICK_TIME;
            frameTicks += TICK_TIME;
            numLoops++;
            ticksSince = newTime - tickCountAtLastCall;
        }

        IndependantTickRun(frameTicks); // handle player input, general housekeeping

        /* If we fell too far behind (and it's not a network game),
           drop the lost time rather than trying to catch it all up. */
        if (!bNetworkGame && (ticksSince > TICK_TIME))
            tickCountAtLastCall = newTime - TICK_TIME;

        if (bCanRender) {
            /* Cast to float: integer division here would always give 0. */
            float percentOutsideFrame = ((float)ticksSince / TICK_TIME) * 100.0f;
            GameDrawWithInterpolation(percentOutsideFrame);
        }
    }
    return 0;
}
I’m convinced, because I spent an entire evening trying to understand how this works, that it is some kind of black magic crafted by a magic person.
It works pretty well. In this instance GameTickRun() runs 20 times a second, i.e. every 50 milliseconds. When it’s not being called, the loop calls IndependantTickRun() and GameDrawWithInterpolation() as many times as possible, thereby maximizing the rendering while ensuring that the gameplay is consistent irrespective of how fast the CPU is: 20 times a second is 20 times a second. This lets the game logic run at a predictable pace while rendering happens as many times as the CPU can manage. Very clever.