I have been working for a couple of months now on a Wipeout-like racing game, available for free on the Windows Store (8.1).
AirMess Window Store
It's still at an early stage, but I'm updating it every month so you can expect regular progress. Currently in the box I have:
- Different kinds of items, from shields to rockets
- Different ships
- 3 races (more coming in the August update)
- 2 race modes: either the normal race or survival, which is like the zone mode in Wipeout
- Play with either the in-game music or your own music library
- Support for both keyboard and gamepad input
- Touch input is supported ... but not fast enough yet on the original Surface RT. The game does achieve a solid 30 fps on Asus x64 tablets, and on a PC it can be configured to run at 60 fps. Currently the ideal minimum system is around an Intel i5 CPU with an Intel HD Graphics 3000 GPU for a solid 30 fps, or 60 fps when turning off some effects / downscaling. Both DirectX 9 level and DirectX 10+ level hardware are supported.
I'm currently working on the August update, which adds more races and looks like this:
I also have a development blog and an indie db page.
The engine is custom made, written entirely in C# / SharpDX / HLSL, and it runs on x86/x64/ARM devices (ARM is slow ... but still).
Everything up to the UI is built as a C# Portable Class Library, meaning the same code runs as-is on all CPU platforms, and on both Windows Desktop and RT. Theoretically it could also run on Windows Phone (but that's too slow) and Xbox One (but nothing has been announced outside of the latest //build demo).
Currently AirMess runs on Windows RT (released) and Windows 8 Desktop. The Desktop version isn't released, as I'm only using it for dev purposes. Basically, releasing it would be hard work because:
- I'd need to update the audio pipeline (I'm using FMOD Ex + XAudio) to work on Win7, because I had crashes there before. I didn't investigate, as I'm developing on Win8
- As-is, the DX10+ renderer crashes on Win7
- I'd need to create an in-game UI for desktop. I do have Direct2D support for a basic in-game UI, but on WinRT I can overlay XAML on top of the game (and that is awesome). On desktop, it's not possible to merge XAML and DirectX without huge performance losses ...
- Meanwhile, it's easy to deploy on the Store without having to provide hosting, setup and so on
I wrote two renderers (on top of SharpDX's DirectX 11 API projection): one for DirectX 9 level hardware, which is a forward renderer, and one for DirectX 10+ level hardware, which is a deferred renderer with a light pre-pass.
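The split between the two renderers could be sketched like this — a minimal stand-in, assuming a hypothetical `IRenderer` interface and a feature-level enum mirroring Direct3D's (these names are illustrative, not the actual AirMess code):

```csharp
using System;

// Illustrative stand-in for SharpDX's D3D feature-level enum.
public enum FeatureLevel { Level_9_1, Level_9_3, Level_10_0, Level_11_0 }

public interface IRenderer { string Name { get; } }

// DX9-level hardware gets the forward renderer.
public sealed class ForwardRenderer : IRenderer { public string Name => "Forward (Reach)"; }

// DX10+ hardware gets the light pre-pass deferred renderer.
public sealed class DeferredRenderer : IRenderer { public string Name => "Deferred (HiDef)"; }

public static class RendererFactory
{
    // Pick the renderer once at startup based on what the device reports.
    public static IRenderer Create(FeatureLevel level) =>
        level >= FeatureLevel.Level_10_0
            ? (IRenderer)new DeferredRenderer()
            : new ForwardRenderer();
}
```

With this shape, the rest of the engine only talks to `IRenderer` and never needs to know which path it is on.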
Because of that, I needed to compile all my shaders in two profiles (SM 2.0 and SM 4.0). For that I modified the shader compiler tool inside SharpDX to compile both profiles and generate strongly typed C# code (with properties for all shader parameters) for the two profiles, with some compilation directives in order to have different code per profile. If a profile isn't supported, the generator emits a "throw new NotSupportedException" in the C# code when you try to instantiate it.
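The generated wrappers might look roughly like this — a hand-written sketch of the pattern, with hypothetical shader names and a trimmed-down profile enum (the real generated code is obviously richer):

```csharp
using System;

public enum ShaderProfile { SM2_0, SM4_0 }

// Hypothetical shape of a generated wrapper for a shader compiled in both profiles.
public sealed class BasicShipShader
{
    private readonly byte[] bytecode;

    // Strongly typed property generated from a shader parameter.
    public float GlowIntensity { get; set; }

    public BasicShipShader(ShaderProfile profile)
    {
        // The generator embeds one bytecode blob per supported profile.
        bytecode = profile == ShaderProfile.SM2_0
            ? new byte[] { 0x01 /* SM 2.0 blob */ }
            : new byte[] { 0x02 /* SM 4.0 blob */ };
    }
}

// A HiDef-only effect: the generator emits a throw for the unsupported profile.
public sealed class MusicVisualizerShader
{
    public MusicVisualizerShader(ShaderProfile profile)
    {
        if (profile == ShaderProfile.SM2_0)
            throw new NotSupportedException("MusicVisualizer exceeds SM 2.0 limits.");
        // ... load the SM 4.0 bytecode ...
    }
}
```

The nice part of this approach is that an unsupported shader/profile combination fails loudly at instantiation time instead of silently rendering garbage.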
Still, around 90% of my shaders are made to work in both profiles; the rest is only turned on in the "HiDef" profile (DX10+), like one of the music visualizers (displayed on the side of the races), which has too many instructions for SM 2.0.
When running in Reach (DX9), a single directional light is applied while drawing the objects.
In HiDef mode I have a light shading pass which additively applies one shadowed directional light (with CSM ... but that's not very useful here) and several point lights, in batches of at most 32 lights (which is more than enough for me).
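The batching itself is trivial to sketch — a minimal version, assuming a hypothetical `PointLight` struct and a 32-entry constant-buffer array on the shader side (names are illustrative):

```csharp
using System;
using System.Collections.Generic;

public struct PointLight { public float X, Y, Z, Radius; }

public static class LightBatcher
{
    // Matches the size of the light array in the shader's constant buffer.
    public const int MaxLightsPerBatch = 32;

    // Split the visible point lights into groups of at most 32; each group
    // is then applied in one additive pass of the light pre-pass stage.
    public static List<PointLight[]> Batch(IReadOnlyList<PointLight> lights)
    {
        var batches = new List<PointLight[]>();
        for (int i = 0; i < lights.Count; i += MaxLightsPerBatch)
        {
            int count = Math.Min(MaxLightsPerBatch, lights.Count - i);
            var batch = new PointLight[count];
            for (int j = 0; j < count; j++)
                batch[j] = lights[i + j];
            batches.Add(batch);
        }
        return batches;
    }
}
```

Since the light pass blends additively, the number of batches only costs extra full-screen (or light-volume) passes, not correctness.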
I have dynamic shadows only for interactive objects (the ships); the environment shadows are computed statically.
For that, I have a DirectCompute shader, executed at asset-build time, which ray-traces across the scene for everything that needs a shadow and creates a lightmap texture which is then sampled at runtime. By doing this I don't have any shadow-map artefacts in the environment lightmaps (except the ones caused by the texture unwrapping of the models, done in a 3D editor), and I'm able to blur them several times to soften them.
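The softening step can be sketched on the CPU as a separable box blur over the baked lightmap texels — a simplified stand-in for what the GPU pass does, not the actual baking code:

```csharp
using System;

public static class LightmapBlur
{
    // One horizontal 3-tap box-blur pass over a width*height single-channel
    // lightmap. Running it several times (plus a matching vertical pass)
    // progressively softens the baked shadow edges.
    public static float[] BlurHorizontal(float[] texels, int width, int height)
    {
        var result = new float[texels.Length];
        for (int y = 0; y < height; y++)
        {
            for (int x = 0; x < width; x++)
            {
                int left  = Math.Max(x - 1, 0);          // clamp at texture edges
                int right = Math.Min(x + 1, width - 1);
                result[y * width + x] =
                    (texels[y * width + left]
                   + texels[y * width + x]
                   + texels[y * width + right]) / 3f;
            }
        }
        return result;
    }
}
```

Because the lightmap is baked once, the blur cost is paid at build time rather than every frame.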
As I started with XNA, I quickly realized those guys actually made good decisions, and my pipeline closely mimics what was in there: a content pipeline with asset importers and processors, the only difference being that I'm doing it outside of VS, in my own scene editor. That way I can rebuild the assets and try them out without rebuilding the game.
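The XNA-style importer/processor split could be sketched like this — hypothetical interfaces with a toy example, not the actual editor code:

```csharp
using System;
using System.IO;

// Importer: turns a source file into an intermediate in-memory representation.
public interface IContentImporter<TOut>
{
    TOut Import(string path);
}

// Processor: turns that intermediate representation into a runtime-ready asset.
public interface IContentProcessor<TIn, TOut>
{
    TOut Process(TIn input);
}

// Toy example: import a text file, then process it into a word count.
public sealed class TextImporter : IContentImporter<string>
{
    public string Import(string path) => File.ReadAllText(path);
}

public sealed class WordCountProcessor : IContentProcessor<string, int>
{
    public int Process(string input) =>
        input.Split((char[])null, StringSplitOptions.RemoveEmptyEntries).Length;
}
```

Keeping the two stages separate is what makes "rebuild the assets without rebuilding the game" possible: the editor re-runs importers and processors on changed files and writes fresh runtime assets, while the game binary never changes.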
Everything is C# from the ground up (plus HLSL for the shaders), and the funny part is that I'm usually GPU bound (except on the Surface RT).
I also get touch support and so on from WinRT's XAML, but it's still best played with a gamepad (who played Wipeout with a keyboard anyway?).
Feel free to try it; feedback is highly appreciated!