As this is my first post on Infinityleap, it should set the tone for the ones to come. Being a practical, boring person, my theme will mostly be technical analysis of the stuff I work on. The subject this time is a WebGL-rendered virtual reality music video called Livyatanim Myth. Yeah, I know that was a mouthful, but at least it's descriptive.
At some point in mid-2015, Israeli director Or Fleisher of Phenomena Labs fame approached me with a fantastic plan. He wanted to create an immersive experience combining the music of the awesome Livyatanim band with some dark, surreal scenery. I had done quite a bit of graphics work by then, but not much of it for art. Yes, graphics are actually useful in fields besides art and games, like 3D printing, for example (http://www.my3dgreece.com).
I found the proposition very intriguing, so naturally I accepted. Or handed me his blueprints, some awesome models by the very talented 3D artist Tomer Rousso, a beautiful Livyatanim song, and the MIDI file that generated it. This last part is very important, because the main feature of the film is the syncing of the music with the visuals.
The MIDI file is parsed in the browser and an event list is generated. On every iteration of the render loop, the audio track's currentTime property is checked, the list is traversed, and any events whose time has passed are picked up. These are then interpreted in several ways, depending on the current scene of the timeline. A nice side effect that came up while developing the MIDI integration is the ability to consume real-time MIDI events as well, made possible by a fairly recent addition to the web platform, the Web MIDI API. Since this post mostly concerns the graphical aspects of the film, I will not go into detail about the MIDI side, but you can get all the info in the following short video tutorial, in case you're interested in using the site as a VJing tool or just for your own amusement.
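To make the traversal idea concrete, here is a minimal sketch of how such an event dispatcher could look. This is not the actual Livyatanim Myth code; the event shape and function names are my own assumptions, but the mechanism (a sorted list and a cursor advanced by the audio clock) is the one described above.

```javascript
// Hypothetical sketch: events parsed from the MIDI file are sorted by
// time, and on each frame we advance a cursor past every event whose
// timestamp has been reached by the audio track's currentTime.
function createMidiDispatcher(events) {
  // events: [{ time: seconds, note: ... }, ...], sorted ascending by time
  let cursor = 0;
  return function update(currentTime, onEvent) {
    while (cursor < events.length && events[cursor].time <= currentTime) {
      onEvent(events[cursor]); // interpreted per the current scene
      cursor++;
    }
  };
}
```

Because the cursor only moves forward, each frame does a small amount of work regardless of how long the track is, and no event is fired twice.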
Now let's move on to the graphics. Standard derivatives were used extensively for the line drawing on the terrain and the whales. Almost everything is rendered in shades of gray, except for the whales at times. The ribbony shader emits gray from plain diffuse lighting, with derivatives and sines modulating its opacity to produce the effect. The shader for the objects meant to appear as ASCII also outputs diffuse lighting, but with the blue channel zeroed out and without any opacity modulation, so it basically draws solid geometry with a yellowish tint. The framebuffer from the base render is then downsampled and blurred in two passes with a standard separable Gaussian to produce a bloom texture. A texture containing the depth of the scene is fed through three consecutive passes of radial blur to produce a texture with sun shafts, as per the classic Crysis technique.
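As a quick aside on the separable Gaussian: the whole trick is that a 2D Gaussian factors into the same 1D kernel applied horizontally and then vertically, so an n×n blur costs 2n taps instead of n². Here is an illustrative sketch (my own helper, not code from the film) of how such a kernel's weights could be computed before being handed to the two blur passes:

```javascript
// Sketch of the 1D kernel behind a separable Gaussian blur: the same
// weights are used for the horizontal pass and the vertical pass.
function gaussianKernel(radius, sigma) {
  const weights = [];
  let sum = 0;
  for (let i = -radius; i <= radius; i++) {
    const w = Math.exp(-(i * i) / (2 * sigma * sigma));
    weights.push(w);
    sum += w;
  }
  // Normalize so the blur neither darkens nor brightens the image.
  return weights.map(w => w / sum);
}
```

In a real pipeline these weights would be baked into the blur shader as uniforms or constants, with the downsampling done first so the blur runs on far fewer pixels.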
These two textures are finally composited on top of the base rendering. This happens in a single post-processing shader, which also modulates the output with some noise for the film-grain effect and picks up the pixels whose blue channel was zeroed out. Those are converted into ASCII using a very clever technique courtesy of Shadertoy. One more sample is fetched from the same base texture, but with quantized UVs this time, to create a fixed grid of characters: every fragment knows which character it should help shape, because it references the same quantized input sample as the rest of the pixels in its cell. The really interesting part is how the input samples are interpreted. Instead of doing another texture lookup to fetch the character data, luminance ranges are mapped to a set of float constants crafted so that their bit patterns contain the character shapes. Just brilliant.
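To illustrate the bit-packing idea (these are my own toy constants, not the actual Shadertoy ones): imagine each glyph as a 4×5 grid of pixels packed row by row into the low 20 bits of a number. A fragment then only needs its cell-local position to test whether its bit is set:

```javascript
// Illustrative sketch of bit-packed glyphs: each character is a 4x5
// grid packed into the low 20 bits of an integer, row-major.
function glyphPixel(glyph, x, y) {
  if (x < 0 || x > 3 || y < 0 || y > 4) return 0;
  const bit = y * 4 + x;       // row-major bit index within the glyph
  return (glyph >> bit) & 1;   // 1 if this cell of the character is "on"
}

// Hypothetical glyph: a hollow 4x5 box, rows 1111/1001/1001/1001/1111.
const BOX = 0b11111001100110011111;
```

The shader version does the same thing with float constants and exp2/mod arithmetic, since GLSL ES 1.0 has no integer bitwise operators, but the principle is identical: the "font texture" is just a handful of carefully chosen numbers.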
The only effect not rendered by the main post-processing shader is the glitch, a final optional pass that just does texture displacement and gets toggled as needed. I considered merging it with the other effects but eventually opted not to. My reasoning was that when not glitching, the rest of the chain would be slightly faster, and when the pass is actually enabled, stuttering is not really an issue and could even enhance the effect.
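For readers unfamiliar with displacement-style glitches, here is a CPU-side sketch of the principle. The real pass is a fragment shader offsetting its texture lookup, but the effect is the same: horizontal bands of the image get shifted sideways. The function and its parameters are purely illustrative.

```javascript
// CPU-side sketch of the glitch idea: shift horizontal bands of an
// image sideways, wrapping around the edge. `image` is an array of
// pixel rows; `offsetForRow` decides each row's shift (in a shader
// this would typically come from a noise function).
function glitchRows(image, offsetForRow) {
  return image.map((row, y) => {
    const off = ((offsetForRow(y) % row.length) + row.length) % row.length;
    // Rotate the row by `off` pixels.
    return row.slice(off).concat(row.slice(0, off));
  });
}
```

With an offset of zero everywhere the pass is a no-op, which matches how a toggled displacement pass degenerates to a plain copy when the glitch is off.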
As this post has already run way too long, I think that's more than enough for now. If you liked this breakdown, more like it will come in the future. Have fun and enjoy the music!