So I kept having this horrible bug, which I detailed about ten posts ago, come back. If you didn’t read that, the problem was basically this: while my framerate was fine, for some reason all the logic in my program would run slowly whenever FRAME_TIME was set below 15 ms. As a brief aside, that variable controls how much time the program has per frame, which means that 1000/FRAME_TIME gives you the frames per second.
Anyway, there were two hugely irritating parts to this bug. The first was that it kept going away and then coming back again. The second was that even when it was happening, the game would often “pulse” between running at real time and running at slowed-down time. Every time, it would eventually go away on its own, so I assumed it was some small change I had made to my code. This time, though, I spent hours thinking about what could possibly be happening in my code, probably in my timers, to cause this bug.
I had a line of code in my program that checked whether the program had completed all of its tasks in the allotted time and, if it had finished early, slept for the remainder. So if the game had taken 2 ms to do the logic and rendering, it would sleep for FRAME_TIME − 2 ms. The problem is that sometimes the operating system would play nicely, and sometimes it wouldn’t. I finally figured the OS was probably the culprit, so I changed the SDL_Delay() call to SDL_Delay(1). This meant the program would sleep for just 1 ms, even if it needed to sleep for a lot longer.
What happened was that the effect became more pronounced. The program would run really fast, then really slow again, at exactly the same slowdown rate. At that point I knew it was an operating system problem. The most precision the OS, or at least Windows, will give you is about 15 ms. That would be fine in and of itself, but weirdly enough, it’s like the OS managed to fool my program into thinking that 15 ms was the time I had allotted for the frame. So if I set FRAME_TIME to 10 ms, my program ran at 67% speed, and if I set it to 5 ms, my program ran at 33% speed. This is why the only FRAME_TIME settings that worked were greater than 15 ms: longer frame times were fine because they were above the OS’s minimum resolution, so it didn’t need to pretend that, say, 10 ms was actually 15 ms.
tl;dr – the operating system was bullying my game into believing that any time less than 15 ms was actually 15 ms, in order to cover its own gross deficiencies.
Problem is, I still need my program to wait between frames. I decided on the hack way of doing things, which is just to make the processor do a bunch of calculations in a while (stillTime) loop, checking whether it’s time to run the next frame yet. This has the downside of being a massive waste of power and processor resources.
Oh well, at least the game is nice and snappy now.
For the record, in the process of figuring this out I added a bunch of command-line debugging. It’s pretty striking how obvious OS time slicing (a different but closely related thing) is. I have the game set to 3 ms per frame, or 333 fps, and 99.9% of frames hit that budget because the game is very simple, but occasionally a frame takes 4–15 ms or so, which is a clear case of the game not getting the processor.