I was looking around for prepackaged engines that would handle features like shadows, glow, and animation. But I have a preconceived notion of how cameras should work, a notion that comes from having developed my own rotation code many years ago.
Here's the long of it (Google Doc): https://docs.google.com/document/d/1_6JdZ0VplMFpBeR3QOcV5vhlOGYyn52T4-fITgI6lbc
I'll attempt to make a short of it.
The Unreal Engine camera is based on component yaw/pitch/roll (Euler angles). This will always be subject to gimbal lock.
OGRE uses quaternions in its core... and although quaternions are 'magic', they don't solve gimbal lock. Rotation operations are order sensitive, so always applying yaw, then pitch, then roll will still result in locking. And to actually render with quaternions, you still convert them to a matrix.
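A tiny demonstration of that order sensitivity (a sketch of the math only, not OGRE's API; the `(w, x, y, z)` storage and axis conventions are my own assumptions): composing the same two 90-degree rotations in opposite orders gives two different quaternions.

```python
import math

def qmul(a, b):
    # Hamilton product of quaternions stored as (w, x, y, z)
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

def axis_angle(x, y, z, angle):
    # unit quaternion for a rotation of 'angle' about axis (x, y, z)
    h = angle / 2.0
    s = math.sin(h)
    return (math.cos(h), x*s, y*s, z*s)

yaw90   = axis_angle(0, 1, 0, math.pi / 2)  # 90 degrees about Y
pitch90 = axis_angle(1, 0, 0, math.pi / 2)  # 90 degrees about X

a = qmul(yaw90, pitch90)
b = qmul(pitch90, yaw90)
# a and b differ (their z components have opposite signs): the same two
# rotations composed in opposite orders give different orientations,
# so quaternions do not remove order sensitivity
```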
Rarely do you need the absolute orientation of a thing except as an initial condition. Most rotations after that happen as small incremental steps. By keeping a step counter and cycling the order the increments are applied in (YPR, PRY, RYP, YRP, RPY, PYR, then back to the start), you keep rotations smooth for 6DOF spacecraft-style movement. To turn that into an FPS camera, all you have to do is extract the roll from the matrix and apply DoRoll( -m.roll ), and the camera stays always-up. (Getting the roll is a couple of if's and a single atan2 of two values, not a huge computation.)
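Here's a minimal sketch of that scheme in Python (the row layout, sign conventions, and class names are my own choices; the real thing would be C/C++ engine code): orientation is a 3x3 matrix whose rows are right/up/forward, per-frame increments are applied in a cycling axis order, and the FPS "always-up" mode extracts the roll with one atan2 and cancels it.

```python
import math

def mat_mul(a, b):
    # 3x3 matrix product a @ b
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def rot(axis, a):
    # rotation applied to the frame's own (local) axis
    c, s = math.cos(a), math.sin(a)
    if axis == "p":                              # pitch: about local right (row 0)
        return [[1, 0, 0], [0, c, -s], [0, s, c]]
    if axis == "y":                              # yaw: about local up (row 1)
        return [[c, 0, s], [0, 1, 0], [-s, 0, c]]
    return [[c, s, 0], [-s, c, 0], [0, 0, 1]]    # roll: about local forward (row 2)

# the six axis orders, cycled one per step
ORDERS = [("y", "p", "r"), ("p", "r", "y"), ("r", "y", "p"),
          ("y", "r", "p"), ("r", "p", "y"), ("p", "y", "r")]

class Camera:
    def __init__(self):
        self.m = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]  # rows: right, up, forward
        self.step = 0

    def rotate(self, yaw, pitch, roll):
        # apply the three small increments in a cycling order to avoid
        # the bias a fixed yaw-pitch-roll order accumulates
        amounts = {"y": yaw, "p": pitch, "r": roll}
        for axis in ORDERS[self.step % 6]:
            self.m = mat_mul(rot(axis, amounts[axis]), self.m)
        self.step += 1

    def level(self):
        # FPS "always-up": extract the roll and cancel it.  Roll is just
        # the atan2 of the world-Y components of the right and up rows.
        r = math.atan2(self.m[0][1], self.m[1][1])
        self.m = mat_mul(rot("r", -r), self.m)

cam = Camera()
cam.rotate(yaw=0.02, pitch=0.01, roll=0.0)  # per-frame mouse deltas
cam.level()                                 # FPS mode: keep the camera upright
```

After `level()`, the right vector has no world-Y component, so the horizon stays level no matter how the increments were accumulated.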
The primary feature I'm very tired of is the lock at +/-90 degrees of pitch. I modified the Blackvoxel engine to use pure matrix rotations; it has several camera modes (FPS for ground travel, but eventually you can build a plane, which is 6DOF), and the experiment was a success! It felt much more natural to look up and up and up and eventually have the camera rotate naturally over the top. It also means that whatever direction you're looking, the up-down and left-right motions of the mouse always produce the same relative changes to the view, instead of left-right always being a rotation about the Y axis, which produces larger changes than you'd expect when tracking targets above or below you. When you watch a plane fly overhead, your head naturally pitches up to a maximum, then you rotate your body and start pitching down to watch the rest...
I'm really very surprised that after so many years of development, FPS games ALL have pitch locked at +/-90 degrees. But now, having searched through so many engines and seen how many keep their orientation as raw yaw/pitch/roll, I understand why.
The other issue is that OpenGL and DirectX take the inverse of the matrix that's most useful to work with. With your X/Y/Z axes stored as the rows of the matrix, you can implement GetRight, GetUp, GetForward, and GetOrigin as pointers directly into the matrix (returning them const prevents external code from rudely modifying internals), so no additional computation is needed. Operations involving motion far outnumber the times you need to feed the matrix to the display rendering system, so it pays to keep the useful layout and invert only when rendering.
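Here's what that row layout buys you, sketched in Python (the accessor names and the OpenGL-style column-vector view-matrix convention are my assumptions): the getters hand back the stored rows with zero computation, motion code works on the rows directly, and the inverse is built only at the one place the matrix is fed to the renderer.

```python
def dot3(a, b):
    return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]

class Transform:
    def __init__(self):
        # rows: right, up, forward, origin -- the layout that's useful to work in
        self.m = [[1.0, 0, 0], [0, 1.0, 0], [0, 0, 1.0], [0, 0, 0]]

    # "free" accessors: each just returns the stored row, no math
    def right(self):   return self.m[0]
    def up(self):      return self.m[1]
    def forward(self): return self.m[2]
    def origin(self):  return self.m[3]

    def move_forward(self, dist):
        # motion code reads the rows directly; no inversion involved
        f, o = self.forward(), self.origin()
        for i in range(3):
            o[i] += dist * f[i]

    def view_matrix(self):
        # Only here, when feeding the renderer, do we build the inverse:
        # the transposed rotation rows plus the negated, rotated origin,
        # laid out the way OpenGL-style APIs expect a view matrix.
        r, u, f, o = self.m
        return [[r[0], u[0], f[0], 0.0],
                [r[1], u[1], f[1], 0.0],
                [r[2], u[2], f[2], 0.0],
                [-dot3(r, o), -dot3(u, o), -dot3(f, o), 1.0]]

cam = Transform()
cam.move_forward(2.0)        # cheap per-frame motion on the stored rows
vm = cam.view_matrix()       # inversion happens once, at render time
```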
I'd really like to get this notion to propagate to the world. Please tear my idea apart.