In all of my tests I regulate the FPS using this method:
#include <SDL.h>
#include <iostream>

const int fps = 30;
const int msPerFrame = 1000 / fps; // 33 ms per frame, so it's more like 30.3030 fps

while (true)
{
    const Uint32 start = SDL_GetTicks();
    // do stuff (game logic and rendering)
    const Uint32 end = SDL_GetTicks();

    // signed, so it can go negative when the frame runs over budget
    const int delay = msPerFrame - static_cast<int>(end - start);

    if (delay > 0)
        SDL_Delay(delay);
    else
        std::cout << "Warning, main loop took " << -delay
                  << " ms more than it was allowed." << std::endl;
}
At the beginning of the game loop I set start = SDL_GetTicks();, then I run the various game logic followed by the rendering, and finish with end = SDL_GetTicks();. Finally, I do the regulation with delay = 1000/fps - (end - start); and call SDL_Delay(delay); if the delay is greater than zero.
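To put concrete (made-up) numbers on it: with fps = 30 the budget is msPerFrame = 1000 / 30 = 33 ms. If a frame's logic and rendering take 20 ms, then delay = 33 - 20 = 13 and the loop sleeps for 13 ms; if they take 40 ms, delay = 33 - 40 = -7, so no sleep happens and the warning reports a 7 ms overrun.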
My question is: is this actually doing what I want, or is there a possibility of fluctuations? I got to wondering about this because sometimes even the smallest drawing functions make my CPU usage go up.
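In case it helps show what I mean by fluctuations, here is a rough sketch (not code from my actual game; prev and now are just illustrative names, and msPerFrame is the same constant as above) of how I could log the full iteration time, SDL_Delay included, to see how much the frame times vary around the 33 ms budget:

Uint32 prev = SDL_GetTicks();
while (true)
{
    const Uint32 start = SDL_GetTicks();
    // do stuff (game logic and rendering)

    // same regulation as above
    const int delay = msPerFrame - static_cast<int>(SDL_GetTicks() - start);
    if (delay > 0)
        SDL_Delay(delay);

    // total time for the whole iteration, delay included;
    // ideally this stays close to 33 ms every frame
    const Uint32 now = SDL_GetTicks();
    std::cout << "frame took " << (now - prev) << " ms" << std::endl;
    prev = now;
}

If those printed times jump around a lot from frame to frame, I assume that would confirm the fluctuations I'm worried about.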