OK... Someone please tell me if I'm doing the math correctly here. This is in relation to my previous post, Spritesheet Animation in OpenGL. TheNut used a fixed frame rate in his example, but I thought I'd go with frame rate independent animation, since a fixed rate isn't working for me; it lags a bit. So, I take the elapsed time at the current iteration and at the previous iteration of the game loop, and use the difference as the delta time for that interval.
Current_Time = RetrieveElapsedTime(); // initialized once in main(), before the game loop
New_Time = RetrieveElapsedTime();     // called at the start of every iteration after that
Delta_Time = New_Time - Current_Time; // time elapsed since the previous iteration
Current_Time = New_Time;              // carry the timestamp over to the next iteration
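For context, those lines sit in my game loop roughly like this (just a sketch — Running, Update(), and Render() are placeholders, and I'm assuming RetrieveElapsedTime() returns seconds as a float):

float Current_Time = RetrieveElapsedTime(); // once, before entering the loop

while (Running)
{
    float New_Time   = RetrieveElapsedTime();
    float Delta_Time = New_Time - Current_Time; // seconds since the previous iteration
    Current_Time     = New_Time;

    Update(Delta_Time); // this is where the sprite animation would be advanced
    Render();
}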
Going by the best of my mathematical knowledge, the reasoning for frame rate independent animation would go something like this:
⇒ Frequency ∝ 1 / Time
⇒ Frame Rate ∝ 1 / Delta Time
⇒ Frame Rate ∗ Delta Time ∝ 1
⇒ ∴ Frame Rate ∗ Delta Time = Constant
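Plugging in numbers to check myself: at 60 FPS, Delta Time ≈ 1/60 s, so 60 ∗ (1/60) = 1; at 30 FPS, Delta Time ≈ 1/30 s, so 30 ∗ (1/30) = 1. Same constant either way.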
So... going by that logic, in order to get animation that looks equally smooth at any frame rate, I should multiply the delta time I just acquired by the current frame rate of the display and use the result to increment the frames of my animation.

Assuming that I am right (please confirm this), here's my question: how would I go about calculating the current frame rate of the loop? I searched the oceanic realm of the Internet, but I couldn't find my water drop. I did read about the drawbacks of a variable frame rate and how to cap it at a certain maximum, but I couldn't find much more than that. Please help. Thank you.
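P.S. To be concrete, here's the increment I have in mind (just a sketch — Frame_Progress, Frame_Count, and Frame_Index are placeholder names I made up, and Frame_Rate is exactly the value I don't know how to obtain):

float Frame_Progress = 0.0f; // frames accumulated so far
int   Frame_Count    = 8;    // number of frames in the spritesheet (placeholder value)
int   Frame_Index    = 0;    // the frame currently being drawn

// Each iteration of the loop, after computing Delta_Time:
Frame_Progress += Frame_Rate * Delta_Time;       // Frame_Rate is the unknown here
Frame_Index = (int)Frame_Progress % Frame_Count; // wrap around the spritesheet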