Porting from SDL1.2 to SDL2 went pretty well with one exception.
I just ported Squares Game (yes, it's a bad title; I didn't want to waste energy thinking of a good one) from SDL1.2 to SDL2. There are a bunch of new features in SDL2, not least of which is hardware acceleration. Hardware acceleration doesn't actually give you any more features than software rendering, but it does make things much faster, although the game is so simple that it doesn't need the speed. You have to use SDL_Texture and SDL_Renderer, and there are a few other small differences, but then you're good to go.
Problem is, I'm not quite using them properly. I noticed that the game was hitching, so I wrote a very small function that checks if the game spent too long on a frame and prints "Game running too slowly\n" (the \n is a newline, equivalent to hitting enter) to the command line. This happens about once every 4 or 5 seconds. That's somewhat understandable: when I call SDL_Delay(timeInMilliseconds), the operating system sometimes doesn't give the core back until significantly later than requested, due to time slicing.
Still, the OS not giving me a core for a full 10 ms seemed a bit odd, especially so often, so I decided to open up Windows Task Manager and profile my code that way. What I saw was the worst memory leak I've seen in my life. Every second the game allocates about 20 more MB, which means it quickly goes over 1 GB, and at one point it even went above 2 GB. I wasn't sure exactly where in the program this was happening, but since all of my own allocations are on the stack right now, and the stack obviously can't be holding 2 GB, the leak had to be coming from some improperly used SDL2 code.
Unfortunately, that turned out to be a minor and easily fixed problem. The leak was due to me not destroying four textures whose data gets recreated each frame (the recreation is unavoidable; they display changing text). The thing is, even though I've now completely gotten rid of the memory leak and my application is stable at a very reasonable ~12 MB, I'm still getting the occasional dropped frame, still about 1 every 5 seconds. I'm willing to say that's entirely an OS scheduler thing.*
No, the real problem is that my game as a whole just runs too slowly. I want to make this very clear: the framerate is fine; it's that everything the game logic drives moves slower, despite being the exact same code. What makes this doubly weird is that when I run Very Sleepy, a program profiler, everything is back to being as fast and snappy as I expect it to be. But even that doesn't work right, as Very Sleepy always crashes at the end with a 'ProfilerExcep: ResumeThread failed' error message. WTF? Why would running a profiler while my code executes make my code work right?
After even more work, where I fooled around with the global constant FRAME_TIME, I got correct results when FRAME_TIME is set to 200 ms. That's great, but I don't really want to run the game at 5 frames per second. This is the weirdest and most irritating bug I've ever seen. What makes it even weirder is that this all worked just fine in SDL1.2. Is it possible that they rewrote the timer functions and that's what's causing all this? It seems like a simple logic problem that isn't part of SDL at all.
Eventually I ended up solving it by setting FRAME_TIME to 15. That basically makes no sense, as it means FRAMES_PER_SECOND has to truncate the result of the division 1000/15 (which is 66.666 -> 66). Why this works is a complete and utter mystery to me.
I tried other FRAME_TIME values. They steadily worked better and better until I hit 15 ms. Higher values work equally well, although there is slight choppiness at only 58.76 FPS, simply due to the (not even that) low frame rate.
Weirdest bug I've ever encountered. The only explanation I can come up with is that there are some operating system or SDL2 shenanigans going on behind the scenes. I profiled the code (it works right now as long as I exit the profiler before I exit the program), and a huge portion of the time is spent in 'RtlInitializeExceptionChain', an operating system function that, after some research, I figured out shows up a lot when running the 32-bit version of SDL2 on a 64-bit operating system.
Update: I changed a flag in my SDL_CreateWindow(params) call from SDL_WINDOW_SHOWN to SDL_WINDOW_FULLSCREEN and it solved the problem. But then I changed it back to SDL_WINDOW_SHOWN and the problem stayed gone, even when I turned FRAME_TIME back down to 10 ms. Seriously, WTF? Was this some kind of compiler bug? Because I had restarted the compiler earlier to no avail. Oh well, if it's working beautifully now, that's all I care about.
*As an aside, I realized that I had much smaller memory leaks all over the place. I forgot to destroy textures when changing the game state, so every state change added 100-200 KB or so to the program. Fixed now.