
Tuesday, June 28, 2016

Shimmerblade: A Side-Scrolling Platform Game

One of my newest and most promising game prototypes is a side-scrolling platform game that uses retro 2D game mechanics (reminiscent of the 8-bit and 16-bit era, when side-scrolling platformers dominated) combined with 3D models and modern graphics techniques.




The game is inspired by some of my childhood gaming memories, borrowing ideas and elements from (apparently quite obscure) games like 'Wonderboy in Monster Land' and 'Wonderboy 3: The Dragon's Trap', both of which I played extensively on my Sega Master System back in the early 90s.


Wonderboy in Monster Land

Wonderboy 3: The Dragon's Trap


I started work on it in early February of this year (2016) and made significant progress between then and the end of May (I realised that you can get a lot done in four months if you push really hard, start with a lot of good content, and make good re-use of code from previous projects). During the early stages the focus was on implementing basic side-scrolling platform game mechanics. I re-used some code (albeit with a fair bit of modification) from 'Shroomsters', one of my projects from a few years ago.

I'm targeting PC this time so that I can focus on nice visuals without too many constraints. It's a pleasant change from mobile development, and it means I don't have to worry as much about overall scope and technical limitations.






I put a lot of work into the visuals -- tweaking materials, coding up various shaders and getting the post-processing effects looking good. The scenes are crafted by hand (with the help of some high-quality outsourced art). I've made use of various post-processing effects, including bloom, screen-space ambient occlusion, radial light scattering (sunshafts / god rays) and FXAA (anti-aliasing achieved via post-processing).




The scope of the game is quite a bit larger than other projects I've focused on in the last two years or so. Previously I had made the decision to deliberately scope down my games because my older, more ambitious projects turned out to be too much work to complete by myself in a reasonable time frame.

As such, rather than working toward a complete product, my initial goal is to finish a vertical slice -- that is, a portion of the game (a quarter of the final game, for example) that is complete and playable in something resembling its final, polished form. From there I can release it as a demo and, depending on how the cards fall, decide how to proceed. Despite some solid progress, a lot of work still remains. By the end of May I had run out of steam and progress slowed, but I hope to get back into it soon.




On a related note, others seem to share my enthusiasm for the Wonderboy games and have also begun working on similar projects. They are larger companies with teams of artists working full-time, so attempting to compete with them directly would be futile. Fortunately, I've taken a fairly different approach, so mine has a very different look and feel.

I definitely plan on playing the aforementioned games when they're ready though (they both look amazing). One is Wonderboy: The Dragon's Trap (a modern remake of the original Wonderboy 3). The other is 'Monsterboy and the Cursed Kingdom' (inspired by the original, rather than being a remake).



Monday, July 15, 2013

Minor Updates to Surreal Landscape Demo

This weekend I did a bit of work on my surreal landscape demo. I re-modelled the jellyfish (more detailed and better looking) and re-worked their animation (for now I'm just doing it in the vertex shader via sine and cosine waves).
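For illustration, here's a minimal CPU-side sketch of the kind of sine/cosine displacement the vertex shader applies; the frequencies, amplitudes and names are placeholders, not the demo's actual values:

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

// Wave-based displacement applied per vertex. 'time' is elapsed seconds;
// 'phase' can vary per instance so the jellyfish don't all pulse in sync.
Vec3 animateJellyfishVertex(Vec3 p, float time, float phase)
{
    // Pulse the bell in and out horizontally with a sine wave, with the
    // effect strongest near the top of the model (larger p.y).
    float pulse = 1.0f + 0.08f * std::sin(time * 2.0f + phase) * p.y;
    p.x *= pulse;
    p.z *= pulse;

    // Bob the whole body up and down with a slower cosine wave.
    p.y += 0.25f * std::cos(time * 0.7f + phase);
    return p;
}
```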

I added some basic trees to get a feel for what it will look like once I start adding more foliage, and also added a bit more vegetation (cycads). I'm using instancing for almost everything at the moment, so I'm using texture atlases (multiple textures combined into one image, with each instance able to use any of them). When setting up each instance, I randomly choose a texture offset which controls which image in the atlas is used. Each instance also gets a random rotation, scale and colour modifier, resulting in greater variety whilst still keeping the number of draw calls to a minimum.
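A rough sketch of the per-instance setup (the struct layout, 2x2 atlas and value ranges are illustrative assumptions, not the actual implementation):

```cpp
#include <cstdlib>

// Per-instance data uploaded alongside the shared mesh. The shader uses the
// atlas offset to select a sub-image from the texture atlas, and applies
// the rotation, scale and colour modifier per instance.
struct InstanceData
{
    float posX, posY, posZ;       // world position
    float rotationY;              // random rotation about the up axis
    float scale;                  // random uniform scale
    float atlasU, atlasV;         // which atlas cell this instance samples
    float colorR, colorG, colorB; // random colour tint
};

float randRange(float lo, float hi)
{
    return lo + (hi - lo) * (std::rand() / (float)RAND_MAX);
}

InstanceData makeInstance(float x, float y, float z)
{
    InstanceData d = {};
    d.posX = x; d.posY = y; d.posZ = z;
    d.rotationY = randRange(0.0f, 6.2831853f); // 0..2*pi
    d.scale     = randRange(0.8f, 1.2f);
    // Pick one of the four cells in a 2x2 atlas.
    d.atlasU = (std::rand() % 2) * 0.5f;
    d.atlasV = (std::rand() % 2) * 0.5f;
    // Slight random tint for variety.
    d.colorR = randRange(0.9f, 1.0f);
    d.colorG = randRange(0.9f, 1.0f);
    d.colorB = randRange(0.9f, 1.0f);
    return d;
}
```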



I also re-worked the way I handle bloom. Previously I had a very simple method in place, which I had cobbled together as a placeholder until I had time to do it properly. I was applying the Gaussian filter both vertically and horizontally in a single pass, as opposed to filtering horizontally first and then vertically in a separate pass. I am doing it correctly now (separate horizontal and vertical passes), which greatly reduces the number of filter samples required. Additionally, I wasn't performing the bright pass before feeding the image into the filter -- instead I performed it later, in the final post-processing shader stage. This produced halos around dark objects, because the bloom buffer contained blurred contributions from both bright and dark parts of the image, causing dark areas to 'bleed' into bright ones. This is now fixed (note that the screenshots accompanying this post were taken before the fix, so some minor colour bleeding may still be visible in them).
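Sketched in C++ for clarity, the corrected ordering looks something like this (the parameter names and kernel maths are generic, not lifted from the actual shader code):

```cpp
#include <algorithm>
#include <cmath>

// Corrected bloom ordering: bright pass first, then a separable Gaussian
// blur (horizontal pass, then vertical pass). With a kernel of width N,
// two 1D passes cost 2N samples per pixel instead of the N*N samples a
// single combined 2D pass would need.

// Keep only the bright parts of the image; dark pixels contribute nothing
// to the bloom buffer, which prevents halos around dark objects.
float brightPass(float luminance, float exponent, float multiplier)
{
    return multiplier * std::pow(std::max(luminance, 0.0f), exponent);
}

// 1D Gaussian weights, normalised to sum to 1. The same weights are used
// for the horizontal and the vertical pass. 'weights' must have room for
// 2 * radius + 1 entries.
void gaussianWeights(float sigma, int radius, float* weights)
{
    float sum = 0.0f;
    for (int i = -radius; i <= radius; ++i)
    {
        float w = std::exp(-(i * i) / (2.0f * sigma * sigma));
        weights[i + radius] = w;
        sum += w;
    }
    for (int i = 0; i < 2 * radius + 1; ++i)
        weights[i] /= sum;
}
```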



I also decided to alter various post-processing parameters based on the time of day. I found that I could tweak the post-processing parameters to make the scene look good under certain lighting conditions, but the lighting conditions differ enough that no single collection of settings looked right in all of them. I now have several sets of values for use in different lighting conditions. These include parameters that control the bloom bright pass (bloom exponent and bloom multiplier), as well as parameters that control the very last phase of the post-processing pipeline -- contrast and saturation.
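A sketch of the idea (the numbers are placeholders, and the linear blending shown is illustrative -- switching between discrete sets would also work):

```cpp
// One set of post-processing values per lighting condition.
struct PostFxSettings
{
    float bloomExponent;   // bright pass exponent
    float bloomMultiplier; // bright pass multiplier
    float contrast;        // final-stage contrast
    float saturation;      // final-stage saturation
};

const PostFxSettings kDay   = { 2.0f, 1.2f, 1.10f, 0.85f };
const PostFxSettings kDusk  = { 2.5f, 1.5f, 1.15f, 0.80f };
const PostFxSettings kNight = { 3.0f, 1.8f, 1.20f, 0.75f };

// Blend between two sets so the settings change smoothly as the time of
// day transitions rather than popping.
PostFxSettings blendSettings(const PostFxSettings& a, const PostFxSettings& b, float t)
{
    return PostFxSettings{
        a.bloomExponent   + (b.bloomExponent   - a.bloomExponent)   * t,
        a.bloomMultiplier + (b.bloomMultiplier - a.bloomMultiplier) * t,
        a.contrast        + (b.contrast        - a.contrast)        * t,
        a.saturation      + (b.saturation      - a.saturation)      * t
    };
}
```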

I find that after applying bloom some of the colours are over-saturated. Although I am going for a fantasy dream-world feel, I don't want it to look too cartoony, so to get the visual results I want I increase the contrast and reduce the saturation of the final image, giving it a slightly more gritty feel.
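A typical way to express that final adjustment (the exact formulas in the demo may differ; this is the standard lerp-to-luma / scale-about-mid-grey approach):

```cpp
struct Color { float r, g, b; };

// Saturation lerps each channel towards the pixel's greyscale luminance
// (saturation < 1 desaturates); contrast scales channels away from
// mid-grey (contrast > 1 increases contrast).
Color adjustContrastSaturation(Color c, float contrast, float saturation)
{
    // Perceptual luminance (Rec. 709 weights).
    float luma = 0.2126f * c.r + 0.7152f * c.g + 0.0722f * c.b;

    c.r = luma + (c.r - luma) * saturation;
    c.g = luma + (c.g - luma) * saturation;
    c.b = luma + (c.b - luma) * saturation;

    c.r = 0.5f + (c.r - 0.5f) * contrast;
    c.g = 0.5f + (c.g - 0.5f) * contrast;
    c.b = 0.5f + (c.b - 0.5f) * contrast;
    return c;
}
```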

Sunday, September 26, 2010

Implementing Thermal Imaging

Over the last two days I've been working on simulating the thermal imaging sensor (TIS) for a military training simulator project. The sensor supports two modes: white hot and black hot. In white hot mode, hotter objects appear brighter: the sky tends toward dark grey or black, the terrain varies from dark grey to white (for hot sand and roads during the day), and people and vehicles tend toward light grey or white. Black hot is simply the opposite (hotter objects are darker), which meant I only really needed to implement white hot and then add a final step to my post-processing shader that inverts the image when black hot is enabled.
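The inversion step is trivial; as a single-channel sketch (names illustrative):

```cpp
// The scene is always rendered white hot; black hot is produced by
// inverting the value at the end of the post-processing shader.
float applyTisMode(float whiteHotValue, bool blackHot)
{
    // whiteHotValue is in [0, 1], where 1.0 is hottest/brightest.
    return blackHot ? 1.0f - whiteHotValue : whiteHotValue;
}
```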





Previously I had implemented night vision and incorporated it into our post-fx pipeline. This entailed adding a custom high-contrast bright pass filter, a green filter (tweaked in various ways to match the reference videos) and a grainy effect (similar to film grain). The excessive contrast works well with the bloom part of the pipeline, giving a dramatic, realistic-looking blooming effect when parts of the rendered image become brighter (for example, when weapons are fired).

I used the same part of the post-fx pipeline to implement thermal imaging, but applied a greyscale filter instead, and used significantly less contrast. At this stage the result was a grainy greyscale image, but there was no differentiation between hot and cold objects.
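Here's a rough single-pixel sketch of that shared filter stage, with illustrative tint and contrast values (the real shader is tuned against the reference videos):

```cpp
// Night vision gets a green tint and high contrast; thermal imaging gets
// plain greyscale with much less contrast.
enum SensorMode { SENSOR_NIGHT_VISION, SENSOR_THERMAL };

struct Color { float r, g, b; };

Color applySensorFilter(Color c, SensorMode mode)
{
    // Perceptual luminance (Rec. 709 weights).
    float luma = 0.2126f * c.r + 0.7152f * c.g + 0.0722f * c.b;

    if (mode == SENSOR_NIGHT_VISION)
    {
        // High contrast about mid-grey, then a green tint.
        float v = 0.5f + (luma - 0.5f) * 2.0f;
        return Color{ 0.1f * v, 1.0f * v, 0.2f * v };
    }

    // Thermal imaging: greyscale with significantly less contrast.
    float v = 0.5f + (luma - 0.5f) * 1.2f;
    return Color{ v, v, v };
}
```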

Looking at the reference videos, I observed that there is significantly more contrast on the thermal imaging camera during the day than at night. I implemented this in the application code itself rather than in shaders. We have a dynamic time-of-day and weather system which allows for any combination of time of day and weather, each with its own set of diffuse, ambient, fog and sky parameters. When thermal imaging is enabled I intercept the calls that set these parameters and supply an alternative set that matches the reference videos. Fortunately, I had reference videos showing the TIS enabled during transitions from day to night and vice versa, so I could run the video alongside our simulator, dramatically increase the simulator's time-lapse rate, and observe the results as I tweaked the values. As for the sky, I modified the dynamic skydome shader so that it bypasses the step that derives colours for the various regions of the sky from the time of day and weather, and instead outputs a single colour based on time of day.
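In sketch form, the interception looks something like this -- the types, names and values are hypothetical stand-ins for the engine's real parameter system:

```cpp
#include <cmath>

// When TIS is enabled, the normal time-of-day lighting parameters are
// discarded and replaced with a thermal set tuned against the reference
// videos.
struct LightingParams
{
    float diffuse[3];
    float ambient[3];
    float fogDensity;
};

// Thermal-mode parameters for a time of day in [0, 1] (0 = midnight,
// 0.5 = midday). Day has much higher contrast than night.
LightingParams thermalParamsFor(float timeOfDay01)
{
    const LightingParams day   = { {0.9f, 0.9f, 0.9f}, {0.15f, 0.15f, 0.15f}, 0.02f };
    const LightingParams night = { {0.4f, 0.4f, 0.4f}, {0.25f, 0.25f, 0.25f}, 0.05f };
    float t = 1.0f - 2.0f * std::fabs(timeOfDay01 - 0.5f); // 1 at midday, 0 at midnight
    LightingParams p;
    for (int i = 0; i < 3; ++i)
    {
        p.diffuse[i] = night.diffuse[i] + (day.diffuse[i] - night.diffuse[i]) * t;
        p.ambient[i] = night.ambient[i] + (day.ambient[i] - night.ambient[i]) * t;
    }
    p.fogDensity = night.fogDensity + (day.fogDensity - night.fogDensity) * t;
    return p;
}

// The intercept point: callers think they're setting the normal parameters.
LightingParams chooseLightingParams(const LightingParams& normal,
                                    bool thermalEnabled, float timeOfDay01)
{
    return thermalEnabled ? thermalParamsFor(timeOfDay01) : normal;
}
```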

In the reference videos the foliage tends to range from dark grey to black (it would seem that plants aren't particularly warm). This was easy enough: I simply intercepted the calls to the SpeedTree API and modified the lighting and fog parameters to give the desired look.

At this stage the scene looked close to the reference videos for the most part, except for hot objects -- we needed some way to make infantry and vehicles stand out white against the dark backdrop. Initially I was going to do this with shaders -- either in the specific shaders for the infantry and vehicles, or via post-processing. However, in the real world vehicles have very specific heat signatures. Some tanks, for example, are deliberately designed to have low heat signatures so that they don't stand out when viewed through a TIS. Similarly, many tanks, trucks and even infantry show very distinctive heat patterns: a tank may appear dark everywhere except around its engine, which glows white, allowing a knowledgeable observer to identify it uniquely. As a result, a generic shader-based solution would not suffice.

To obtain realistic results we needed the artists to provide an additional set of customized textures, one for each vehicle and infantry unit type. When thermal imaging is enabled, the graphics engine is instructed to swap out the default set of diffuse maps for an alternative set of heat maps. The final result is a rendered image that is very close to the real thing.
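Conceptually, the swap is as simple as this (types and names are illustrative, not the engine's actual API):

```cpp
#include <string>

// Each unit type carries both a default diffuse map and an artist-authored
// heat map; the engine binds one or the other depending on whether thermal
// imaging is enabled.
struct UnitMaterial
{
    std::string diffuseMap; // normal rendering
    std::string heatMap;    // thermal signature (e.g. bright engine area)
};

const std::string& selectTexture(const UnitMaterial& m, bool thermalEnabled)
{
    return thermalEnabled ? m.heatMap : m.diffuseMap;
}
```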