
2020 will be the best year for Real-Time Rendering in Events

by Enzo
Posted on 24 January 2020
Photo credit: Martin Sanchez | @zekedrone

As in many other sectors, there is a demand in Events and Entertainment to visualise projects before they are realised. Hardly any event, these days, is presented to clients and investors without the support of 3D models and renders. Particularly in the Corporate Convention sector, there is a strong emphasis on presentation quality for high-calibre clients and their brands. These range from simple 3D SketchUp models to high-quality photo-realistic renders and animations.


Is Real-Time Rendering changing the game for Events?

Real-Time Rendering and Visualisation are hardly new concepts: the first 3D computer games using such technology appeared in the '80s, but for many years, despite impressive quality advancements, they remained confined to the Computer Graphics category and were generally recognisable as such. It's only in recent years that Photo-Realism has started blurring the lines between the real and virtual worlds.

Pioneered by the Videogames industry, Real-Time rendering has since found applications across a multitude of sectors, from Architecture to VFX, Product Design and Automotive, while the Events industry still lags behind.

In Event Production, we've seen Real-Time Visualisers for quite some time. CAST's Wysiwyg has been around since the '90s, and many other visualisers have followed suit, but these tend to be specific to Lighting Design and, impressive as they are, they offer little control in terms of realistic rendering quality beyond lighting and video.
Even the industry-standard software in event production, Vectorworks' Spotlight, offers no Real-Time rendering beyond its Vision package (also a Lighting Visualiser).

With no shortage of Real-Time renderers like Lumion or Twinmotion for Architecture, why is the Events Industry so slow to catch on?

Traditional 3D Rendering

Traditional, or Offline, Rendering happens on a frame-by-frame basis. For photo-realistic results, each image (or frame) is individually calculated by the rendering engine through a process called Ray Tracing.

This process, as the name implies, traces the rays from light sources in the scene as they collide with 3D objects. The process is calculated for each pixel in the frame multiple times, depending on the number of bounces, i.e. the number of times a ray is reflected off an object and onto another, simulating how light behaves in the real world. The more bounces are calculated, the more realistic the image will appear.
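
In code terms, that per-pixel bounce loop looks roughly like the sketch below. It is purely illustrative: the scene, camera, hit and material objects are hypothetical placeholders, not the API of any real rendering engine.

```python
# A minimal sketch of the per-pixel, per-bounce loop described above.
# The scene, camera, hit and material objects are hypothetical placeholders,
# not the API of any real rendering engine.

MAX_BOUNCES = 4  # more bounces = more realistic light, but more computation


def mix(a, b, t):
    """Linear blend of two RGB colours by factor t."""
    return tuple(ac * (1 - t) + bc * t for ac, bc in zip(a, b))


def trace(ray, scene, bounce=0):
    """Follow one ray through the scene, adding up light at every bounce."""
    if bounce > MAX_BOUNCES:
        return (0.0, 0.0, 0.0)          # stop recursing: no more light to add

    hit = scene.intersect(ray)          # nearest object this ray collides with
    if hit is None:
        return scene.background_colour  # the ray escaped the scene

    # Light arriving directly from the light sources at the hit point
    colour = hit.material.shade(hit, scene.lights)

    # Reflect the ray off the surface and follow the next bounce
    bounced = trace(hit.reflect(ray), scene, bounce + 1)

    # Blend the bounced light in, according to how reflective the surface is
    return mix(colour, bounced, hit.material.reflectivity)


def render(scene, camera, width, height):
    """One frame: at least one traced ray for every pixel."""
    return [
        trace(camera.ray_through_pixel(x, y), scene)
        for y in range(height)
        for x in range(width)
    ]
```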

This method can produce highly detailed, high-quality results, but it is very resource-intensive, leveraging the full power of the CPU and RAM. Depending on the power of the PC and the complexity of the scene, it can take anything from a few seconds to several minutes (or hours) to render a single frame. It is well suited to still images, but rendering animations can keep a machine occupied for several days (or weeks).
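
A quick back-of-the-envelope calculation shows why. Assuming, purely for illustration, five minutes of render time per frame for a 25 fps clip:

```python
# Back-of-the-envelope: offline rendering a one-minute animation.
# The 5-minutes-per-frame figure is an illustrative assumption, not a benchmark.

fps = 25                  # frames per second of the finished clip
clip_seconds = 60         # a one-minute animation
minutes_per_frame = 5     # assumed offline render time per frame

total_frames = fps * clip_seconds                     # 1,500 frames
total_hours = total_frames * minutes_per_frame / 60   # 125 hours

print(f"{total_frames} frames -> {total_hours:.0f} hours "
      f"(about {total_hours / 24:.1f} days on a single machine)")
# 1500 frames -> 125 hours (about 5.2 days on a single machine)
```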

For this reason, larger studios employ Render Farms to speed up the process significantly, but this option is outside the scope and budget of most ordinary users. Outsourced Render Farms are another option, but these too can be costly, and they often present compatibility issues between software packages.
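
Render farms scale so well because every frame is independent and can be handed to a different machine. A toy sketch, reusing the hypothetical numbers from above:

```python
# Toy illustration of why render farms help: frames are independent,
# so they can simply be divided among machines. Numbers are hypothetical.

total_frames = 1500       # the same one-minute clip as above
minutes_per_frame = 5
nodes = 25                # machines in the farm

# Round-robin assignment: frame i goes to machine i % nodes
assignments = {n: list(range(n, total_frames, nodes)) for n in range(nodes)}

busiest = max(len(frames) for frames in assignments.values())
wall_clock_hours = busiest * minutes_per_frame / 60

print(f"{nodes} machines -> roughly {wall_clock_hours:.0f} hours instead of 125")
# 25 machines -> roughly 5 hours instead of 125
```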

So what exactly is Real-Time rendering?

Unlike the traditional method above, Real-Time Rendering relies mostly on the GPU (Graphics Card) to process calculations in Real-Time. Rendering a single image is instantaneous and as easy as hitting 'save image', while rendering an animation takes barely more than the length of the animation itself, once the scene has been set up. Sounds almost too good to be true, and until very recently, it was.

GPUs couldn't handle Ray Tracing in real time, so many options available in offline rendering were out of the question. While materials and textures could have a great level of detail, light bounces, global illumination, reflections, refractions and other elements that help make a scene look realistic had to be simulated rather than calculated, making the difference between offline and real-time renders very noticeable.

To close the gap, real-time engines use a process called light baking, or lightmaps, where some of the calculations are performed offline, then 'baked' into the scene. This works well for static objects, but since the baked light and shadows never change, moving or animating an object in real-time would leave its shadow, reflections, etc. fixed in place.
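
In very simplified terms, the baking idea works something like the sketch below. The function and object names are illustrative placeholders, not taken from any real engine.

```python
# Very simplified illustration of light baking / lightmaps.
# The expensive lighting calculation runs once, offline, and its result is
# stored ("baked") per static surface point; at runtime the engine only
# looks the value up. All names here are illustrative, not a real engine API.

def expensive_lighting(point, lights):
    """Stand-in for the slow, ray-traced offline calculation."""
    return sum(light.contribution_at(point) for light in lights)

def bake_lightmap(static_points, lights):
    """Offline step: precompute and store lighting for every static point."""
    return {point: expensive_lighting(point, lights) for point in static_points}

def shade_at_runtime(point, lightmap, fallback):
    """Runtime step: a cheap lookup instead of a calculation.
    If an object has moved, its points are no longer where the lightmap
    expects them, so shadows and reflections stay stuck where they were baked."""
    return lightmap.get(point, fallback)
```

Enter RTX!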

Last year, Nvidia announced its RTX Series of GPUs, successors to the popular GTX Series, bringing Ray Tracing to the world of Real-Time rendering and closing the quality gap with offline rendering once and for all. (Well, almost.)
Rendering engines like Unreal and Unity, leveraging the power of the RTX architecture, can produce results every bit as convincing as their offline counterparts, slashing rendering times to almost zero and eliminating the need for render farms altogether.

Why is it a Big Deal?

Until very recently, high-quality photo-realistic renders, and animations in particular, were a luxury afforded only to clients with larger budgets. Rendering Studios charge between $5,000 and $15,000 for a one-minute clip. In the grand scheme of an architectural project, for example, the budget allocated to rendering services is a relatively minor expense, particularly if it's a deciding factor in winning a contract potentially worth millions.

Unlike architecture, the Events industry revolves around tight budgets and even tighter deadlines, with exceptions few and far between, so it would make no sense to invest a considerable amount of resources in a presentation which, even if successful, can only yield modest profits.

We've already seen how Real-Time Visualisers such as Wysiwyg and Vision have made a huge impact on Lighting Designers, opening up possibilities which were unthinkable in the days of on-site Lighting Programming. Light shows have become exponentially more complex and engaging thanks to the technology, but these are specialist tools aimed at Lighting Designers and Programmers, and they are generally employed at the production stage, once the Event has already been confirmed and secured.

Real-Time Rendering could open up a whole range of opportunities to a much wider audience, such as smaller productions still in the pitching phase, looking to impress clients with much more than simple still images. Anything from complex animations to interactive virtual tours and augmented reality would be at their fingertips, at a fraction of the cost and timeframe required even just a few months ago.

What can it be used for?

The range of potential applications for Real-Time rendering in the Events Industry is every bit as wide as in other sectors where the technology has long been adopted.

There is a myriad of ways in which Real-Time Rendering can help bring presentations to life and create engaging content:

Complex Animations

Besides camera moves, any other element in the scene can be animated, including lighting, furniture, scenery, video and projection screens, and pretty much anything else. And unlike with traditional rendering, we don't have to wait to see the results.

Game-style interactive walkthroughs

This offers significant advantages compared with standard 360° photo-based virtual tours. Spaces can be visualised in different configurations, and we can dynamically select different options for colour, furniture, lighting and much more. Trade fair organisers could show different configurations for exhibition stands or booths. Venues could explore different possibilities for their spaces, or Museums could show an archive of past or future exhibitions.
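
As a rough idea of what "dynamically selecting options" could look like behind the scenes, here is a hypothetical sketch of switching between stand configurations during a walkthrough; the preset names and the apply_preset helper are invented for illustration, not a real engine API.

```python
# Hypothetical sketch of switching between preset configurations while a
# client walks through a virtual exhibition stand. The presets and the
# apply_preset() helper are invented for illustration, not a real engine API.

presets = {
    "launch_event": {"wall_colour": "charcoal", "furniture": "lounge",
                     "lighting": "warm_spots"},
    "trade_fair":   {"wall_colour": "brand_blue", "furniture": "demo_counters",
                     "lighting": "bright_even"},
}

def apply_preset(scene, name):
    """Swap colours, furniture and lighting in the live scene."""
    for option, value in presets[name].items():
        scene.set_option(option, value)   # placeholder for the engine call

# While the client explores, a single key press could switch the whole
# configuration, e.g. apply_preset(scene, "trade_fair")
```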

Augmented / Mixed Reality

3D elements can be viewed in a real-world context through AR glasses like Microsoft's HoloLens, and the use of this technology doesn't have to be limited to presentations. It can be integrated right into the events themselves, offering audiences a truly unique perspective. Even a simple PowerPoint presentation can turn into an engaging immersive experience.

An example of Real-Time Augmented Reality

What's the hold-up?

Much of the software technology around Events and Entertainment is focused on production. Tools like Vectorworks Spotlight offer unparalleled control over every aspect of a production's planning and design stages, and Lighting Designers are spoilt for choice.
While in other sectors the initial presentation of a product is a key determining factor, much of the Events Industry puts its trust in Organisers, Production Companies and Venues to deliver on their promise, based on their past credentials.

With the historically high cost of quality renders, and with many rendering software packages targeted at other sectors and lacking features specific to events, it is perhaps little surprise that the industry has been left somewhat behind.

Conclusion

There is no doubt that as technology evolves, so will demand, and it's only a matter of time before the Events Industry embraces the unquestionable advantages of Real-Time Rendering. With many production software packages already on the market, there has never been a better time for Production Planning and Design, but there is still a long road ahead, full of exciting developments.
I'd urge anyone to exploit the many possibilities Real-Time technology has to offer and stand out from the competition at this early stage, before it all kicks off.

