Okay, I'm having a bit of a problem with a game I've been developing for a while now, and I'm not sure how to give a good explanation of what's going on without posting the entire source code. I'm hoping this problem sounds familiar to someone and I can be pointed in the right direction, because at the moment I am completely stumped.
The game is a basic 2D vertical-scrolling shoot-em-up. I'm using Direct3D to render my sprites as simple textures on 4 verts each, and I have just converted my time stepping from variable to fixed, though I noticed this problem slightly before I decided to change the time stepping.
The problem itself is that moving sprites don't move smoothly. They move fine for a few seconds, but every few seconds there's a bit of a jump that is sometimes barely noticeable and other times makes it very ugly to watch.
I've traced through the code quite a few times, profiling different areas such as how often Render is called and how often Update is called, and I'm pretty sure there are no runaway loops or major bottlenecks. My hunch is that fixed time stepping sometimes allows large gaps with no updates and then suddenly calls Update several times in a row. But that doesn't explain why I was noticing it while I was still using variable time steps.
In case you're wondering, my fixed time steps are worked out with the following code:
realTime = CurrentTicks() - startTime;
while (gameTime < realTime)
    gameTime += TIME_STEP;
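To make that concrete, here's the shape of the loop pulled out as a standalone function (a minimal sketch; `TIME_STEP` is assumed to be in milliseconds, and the comment marks where the real game would call Update):

```cpp
// Minimal sketch of the fixed-step catch-up loop described above.
// TIME_STEP is assumed to be milliseconds per physics update.
const long TIME_STEP = 30;

// Runs as many fixed steps as needed to catch gameTime up to realTime.
// Returns how many Update() calls this frame would perform.
int StepGameTime(long realTime, long& gameTime)
{
    int updates = 0;
    while (gameTime < realTime)
    {
        gameTime += TIME_STEP;
        ++updates; // the real Update() would be called here
    }
    return updates;
}
```

This is where the "spike" behaviour comes from: if one frame takes longer than TIME_STEP, the loop simply runs several updates back to back on the next frame.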
What I've noticed is that Update may not be called for approximately 4 calls to Render on average, but sometimes there's a spike and it isn't called for 5 or 6 renders, and I think that is what is causing the jitter. However, I'm really not sure, and I'm finding it harder and harder to describe my theories.
Anyway, I'm just wondering if anyone else has had similar experiences and what the cause/solution was. I'm hoping it might give me some ideas or help me see the light at the end of the tunnel. It seems like I might have to live with it for now, but I don't want to code much more in case that makes it more complex to fix later on.
Thanx in advance to anyone who even takes the time to read my ramblings
It may be the timing loop; can't really say. What is the value of TIME_STEP? You could get multiple Update calls if it takes longer than TIME_STEP to render one frame, because then gameTime += TIME_STEP will still leave gameTime less than realTime. Your main goal is to update every TIME_STEP milliseconds, right? If so, here's an alternative you can try that I know works:
if (lastUpdate + TIME_STEP < CurrentTick())
{
    // run one Update() here
    lastUpdate = CurrentTick();
}
The code above will also cause problems if your frame takes too long to render, though. There won't be any jumps, but there will be stalls.
TIME_STEP is 30 at the moment, meaning each update advances the physics by 0.03 seconds.
The theory for the game loop came from a presentation I found, as well as other theory from flipcode.com.
The problem with what you have suggested is that I need to keep the physics in step with real time. So if there is a difference of 60 between the last update and the current time, I need to run 2 steps of physics to make it current.
The rendering process hardly takes any time at all, since it's only a 2D game, there's currently not much happening on screen, and there will most likely never be a point in the game that puts strain on the rendering pipeline. The physics is almost always going to take the longest to calculate, and I have a feeling that's where my problem is. But the more I think about it, the more my brain turns to ooze and I get nowhere. I think it's one of those days.
Okay, well after a lot of searching around, a lot of testing and a lot of cluelessness, I've found out a few things.
The fixed time stepping was causing the majority of the jitter, purely because of the nature of it. Interpolating between steps at the render stage helped a lot, but I'm still getting some jitter on motion with acceleration or deceleration, I'm guessing because my interpolation is sketchy at best right now.
I also found that the jittering I was having with variable time stepping seemed to be caused by the fullscreen mode I was using in DirectX. I'm not entirely sure why, but changing from 16-bit fullscreen to 32-bit made a very big difference, at least in fullscreen.
Thanks for following through. So it turns out there wasn't really a problem with the variable timing, just a performance problem. I remember a problem like this when using the Tokamak physics SDK. You'd pass the simulator the frame time each frame, and if one frame time was very different from the previous one the simulator would choke. The way around it was to make sure the time of each frame didn't change too much, slowly interpolating over changing frame rates.
*looks for code*
static float fLastElapsed = 0.0f;

// ms is the frame time in milliseconds.
float fElapsed = (float)ms / 1000.0f;

// Clamp the elapsed time to within 20% of the previous frame's time,
// so a sudden spike or drop can't exceed a 20% change per frame.
if (fLastElapsed != 0.0f)
{
    if (fElapsed > fLastElapsed * 1.2f) fElapsed = fLastElapsed * 1.2f;
    if (fElapsed < fLastElapsed * 0.8f) fElapsed = fLastElapsed * 0.8f;
}

// Make sure the frame time doesn't exceed 1/45th of a second (about 22 ms).
// (For reference, ~17 ms frames is approximately 60 fps.)
if (fElapsed > 1.0f / 45.0f) fElapsed = 1.0f / 45.0f;

fLastElapsed = fElapsed;
Not sure if the above would help, but no harm in trying eh.
You might want to try triple buffering: set the number of back buffers to 2. In fullscreen mode this should help a ton. Also, you could try using D3DPRESENT_INTERVAL_ONE to keep your fps at 60 (or whatever the monitor refresh rate is). 60fps is very nice; I highly recommend it.
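In D3D9 terms that's roughly the following present parameters (a sketch from memory; double-check the field names against the SDK docs before relying on it):

```cpp
D3DPRESENT_PARAMETERS d3dpp = {};
d3dpp.Windowed            = FALSE;
d3dpp.SwapEffect          = D3DSWAPEFFECT_DISCARD;
d3dpp.BackBufferCount     = 2;  // 2 back buffers + front buffer = triple buffering
d3dpp.PresentationInterval = D3DPRESENT_INTERVAL_ONE;  // wait for vblank
```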
I haven't read your whole post, but maybe you're facing a timer precision problem. You may be querying the timer so fast that it returns the same tick count for a couple of frames, so if you're calculating deltaTime (the time between frames) you'll get zero. That will cause jitter.
You can either turn on vsync or use a more accurate timer.
What counter are you using for your CurrentTick() function? I'd recommend using a high-precision query counter like QueryPerformanceCounter() if you're using windows or SDL_GetTicks() if you're using SDL. You can find more information here.
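For illustration, something along these lines (a portable sketch using std::chrono; on Windows you'd typically wrap QueryPerformanceCounter() and QueryPerformanceFrequency() instead, and the function name here just mirrors the one in your post):

```cpp
#include <chrono>

// Portable stand-in for CurrentTicks() with high precision.
// GetTickCount() by contrast only updates every ~10-16 ms, so it can
// return the same value for several consecutive fast frames.
long CurrentTicks()
{
    using namespace std::chrono;
    static const steady_clock::time_point start = steady_clock::now();
    return (long)duration_cast<milliseconds>(steady_clock::now() - start).count();
}
```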
I was switching between GetTickCount() and QueryPerformanceCounter() during some testing and didn't really notice a big difference, so I believe it's all okay around that section.
I don't think I need two back buffers, mainly because the drawing is very simple and using two seems kind of excessive. I am using D3DPRESENT_INTERVAL_DEFAULT for my presentation interval. I have a feeling this defaults to ONE, but I'll try explicitly setting D3DPRESENT_INTERVAL_ONE and see if there's any difference.
Yes, the jitter at least in windowed mode seems to be caused by performance loss. I noticed that if I moved windows around or loaded up something, I would get large jumps in the motion. I will try averaging out the framerate. This idea had crossed my mind previously but I was unsure of how to implement it and which functions would benefit from it. I'll give it a go later today and see how I go.
In fullscreen, however, the jitter seems to be caused by incorrect rendering. In 16-bit mode, other apps seemed to be fighting for drawing focus and showing through. I switched to 32-bit mode and everything was perfect and smooth. I don't believe the actual number of colours has anything to do with it, rather the mode I'm using in each case. I'll do some more playing around with the different modes today as well.
Thanx for the ideas guys.
Okay first thing, D3DPRESENT_INTERVAL_ONE didn't change anything because that is the default.
However, I believe I have sufficient information to deal with this problem now. The first thing, as I said in my other post, is that fixed time stepping needs an interpolating rendering stage so we can work out an entity's position in between logic updates. I have implemented this now as best I can, and there is still a little jitter with varying velocities, but it is hardly noticeable unless you're looking for it.
Next, I found a rather large problem with my particle system. It was using an array of 50,000 particles, and each update it would cycle through the whole array looking for particles with a remaining lifetime, which obviously caused some lag. It is using a linked list now and is much better.
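To give the idea, here's roughly what I changed it to (a simplified sketch, not the actual particle code): live particles sit in an intrusive linked list, so the update only ever touches active particles instead of scanning all 50,000 slots.

```cpp
// Simplified sketch: particles in an intrusive singly linked list, so
// Update() walks only live particles rather than the full array.
struct Particle
{
    float lifetime;   // seconds of life remaining
    Particle* next;
};

struct ParticleList
{
    Particle* head = nullptr;

    void Add(Particle* p) { p->next = head; head = p; }

    // Ages every live particle and unlinks the ones that expire.
    // Returns the number of particles still alive.
    int Update(float dt)
    {
        int alive = 0;
        Particle** link = &head;
        while (*link)
        {
            Particle* p = *link;
            p->lifetime -= dt;
            if (p->lifetime <= 0.0f)
                *link = p->next;   // unlink expired particle
            else
            {
                ++alive;
                link = &p->next;
            }
        }
        return alive;
    }
};
```

In the real system the expired particles would go back on a free list for reuse rather than being allocated fresh each time.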
Now, the fullscreen problems I was having (jittery motion and some focus fighting) I haven't solved completely, but I have some theories. Firstly, the focus fighting was with Winamp, which is set to always-on-top, so Windows was trying to draw both. This didn't seem to cause a problem in 32-bit mode, but in 16-bit mode it was interfering wildly. I believe this could be because my desktop is in 32-bit mode, so when I go fullscreen in 16-bit mode Winamp's window has to be converted, causing some computational lag and the jitter.
There is one more problem I'm having with jitter in 16-bit mode, even when Winamp isn't fighting for focus: every second I get a small jump. I'm thinking this could be a framerate/update-rate synchronisation problem, and it could possibly be resolved with the averaging of frame times suggested earlier.
However I'm satisfied with it at the moment and will most likely come back to it to perfect it later, but now I'm concentrating on getting some other functionality working so I can release a test version for people to try.
Just a quick thought...
I've messed around a bit with the DirectX Present() function, and I've noticed that it's not exactly perfect. It seems to miss a few vertical blanks every second, and some seconds are worse than others.
If it misses enough blanks in one second, that can lead to some jitter as your interpolation tries to adjust.
You might want to start isolating parts of your render loop and timing them to see what is causing you to lose frames. If it is the Present() function, then at least you know it's nothing that you're causing.