
Homegrown rendering with Rust

If you’ve followed what we do at Embark, you may know that beyond our games, we’re also working on a creative platform. It’s a project focused fully on user-created content, and enabling players to build their own worlds and interactive experiences.

I’m Tomasz, a software engineer here at Embark. In this technical blog post, I’ll shine a light on how we approach 3D rendering of user-created worlds, how Rust helps us achieve our goals, and share some exciting open source news.

Our creative platform in the making; rendered in real time in our engine built with the Rust language on CPU and GPU!

So, as it turns out, you actually can have too much of a good thing. When we started our journey at Embark, we had a disproportionately high number of rendering engineers on our platform project. We quickly realized it would be a trap to lose ourselves in shiny pixels at the outset, particularly as we had more important things to take care of before rendering.

Luckily, we are also generalists, and huge fans of the Rust language (which turns a regular developer into a superhero). And so, we put off graphics for a while, and instead focused on gameplay, physics, audio, and all of the other bits and pieces that a new game engine needs. We rolled with vertex colors and blob shadows for almost two years, relying on the insensitivity (and affinity) of engineers to visual transgressions.

Eventually, our true ‘rendering’ selves emerged and could be held back no longer. After all, when artists start faking indirect lighting, fog, and Fresnel equations with vertex colors, it’s a desperate cry for help.

It became clear to us that we needed some rendering tech.

Rust has an incredible open source community, so you never need to start completely from scratch. While we found plenty of great building blocks, we did not unearth any ready-made solutions that would fit all of our needs. It was time to roll up our sleeves and get to work.

The rendering engineers rejoiced!

… perhaps a bit too much. We made not one, but two renderers. Sort of — they are closely related, share a lot of code, and both have exciting modern features, such as ray tracing and real-time global illumination.

We also wanted to bring the benefits of the Rust language and ecosystem to the GPU; hence, the rust-gpu project was born. Our rendering engineers would no longer have to choose between using a great language and writing shaders!

We’ll come back to rust-gpu later in this article — for now, let’s focus on the shiny pixels and what makes them sparkle.

User-created scene with automatic ray-traced reflections and multi-bounce diffuse lighting

Great visuals for everyone

Our Rust project has different requirements than a video game. It’s a platform that will enable everyone — not just professional game makers — to build new small interactive experiences. For rendering, this means working with user-generated content, and not requiring expert game developer knowledge to achieve stunning results.

Instead of painstakingly placing reflection probes or baking static lightmaps, we want any object — whether it’s static or dynamic — to emit and scatter light realistically. If a player wants to illuminate their virtual home, all they need to do is make an object glow, or punch a hole in the ceiling to let the sun in. Light will then bounce off of surfaces, naturally filling the space.

Achieving all of this in real-time on current graphics hardware is an active area of research, and it’s something of an unsolved problem. In the lingo of rendering engineers, that means fun.

That’s also where my personal story connects — I’ve always been interested in real-time global illumination, and spent a good chunk of my career researching it. Of course, it has been fascinating to work on gameplay and systems at Embark, but with the joy that rendering brings me, I could not resist developing a global illumination renderer as a side project. As a benefit, when the time came to tighten up the graphics in our creative platform, we were able to hit the ground running.

Today, we are sharing this experimental renderer “kajiya” with you.

Open-sourcing kajiya

A simple scene rendered with kajiya. Car model by Rust Shake.

In kajiya, RTX is always On!

No, that doesn’t mean giant reflective puddles or a hyper-realistic Spongebob. It merely means that the rendering looks right. Sometimes you might not even know it’s there — global illumination can be difficult to explain until you turn it off.

Now, before you go running and hook kajiya up to your favorite Rust game engine, there is a disclaimer: it is a work in progress, and it is shaky. Perhaps perpetually so.

The renderer isn’t built to ship games (yet), but it serves as a convenient platform for learning and research. To that end, it’s heavily opinionated, and only includes a basic content pipeline and scene model. Many commonly used game features aren’t supported yet, such as transparency, particles, skin, or hair. It doesn’t have a material graph, skinning, spot lights, point lights, or a grading pipeline. Not even window resizing. It also requires hardware ray tracing support to run (though not necessarily an NVIDIA RTX card).

On the other hand, all objects participate in global illumination. There is a sun and a sky, temporal super-resolution and anti-aliasing, a reference GPU path tracer, and more. The renderer is easy to hack and extend, and it copes well with a wide range of scenes in practice.

It is also permissively licensed, and based only on publicly available information and open source libraries, so hopefully it will be useful to the Rust and graphics communities in many ways. It will continue to evolve and incorporate the latest research in rendering; there is already a new global illumination solver brewing, built on recent reservoir resampling techniques.

Check out the open source repository!

Our production renderer

Of course, there’s a big difference between a prototype and something usable in the real world. For the renderer to be useful in production, it needs to run on devices without ray-tracing features, and support numerous gameplay-specific features. That has been an ongoing effort by Henrik Rydgård, Viktor Zoutman, and Gray Olson on our platform rendering team.

The production renderer uses a lot of code from kajiya, with extensions for limited transparency, animation, dynamic mesh modification, and more. It even supports window resizing.

Post-process depth of field seals the deal

Putting real-time global illumination into the hands of our users and developers has been transformative. Anyone can create great-looking content by simply moving objects around. Worlds cobbled together through quick kitbashing can feel solid and cohesive. Models with quite different styles fit well together, encouraging creative experimentation.

For now, we’re keeping the production renderer closed-source, but we will continue to share code and cross-pollinate ideas with kajiya and the larger open source Rust graphics ecosystem. If you are an experienced graphics programmer, and being part of this sounds like fun, we might have a job just for you.

The technical details

When we set out to build our shiny new renderers, we wanted access to cutting-edge features offered by the latest graphics hardware. That’s where we take a different approach from what’s currently popular in the Rust community — instead of going for maximum portability, we choose to target only the Vulkan API. This still allows us to run on a wide range of hardware, without compromising on functionality.

For low-overhead interaction with the GPU, we use ash (created by Maik Klein, who also happens to be an Embarker). The crate closely follows new developments in Vulkan, exposing all of the latest and greatest extensions. It also provides a thin convenience layer, but gets out of the way otherwise, leaving opinions to our own abstractions.
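To give a sense of that thin layer, here is a minimal sketch of bringing up a Vulkan instance with ash. The overall shape follows ash’s actual API, but treat the details (Vulkan version, error handling, and everything past instance creation) as illustrative assumptions rather than code from our engine:

```rust
use ash::{vk, Entry};

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Load the Vulkan loader dynamically; ash also offers a statically
    // linked entry point behind a feature flag.
    let entry = unsafe { Entry::load()? };

    // ash mirrors the Vulkan C structs one-to-one, and `Default`
    // fills in the correct `s_type` fields for us.
    let app_info = vk::ApplicationInfo {
        api_version: vk::make_api_version(0, 1, 2, 0),
        ..Default::default()
    };
    let create_info = vk::InstanceCreateInfo {
        p_application_info: &app_info,
        ..Default::default()
    };

    // Raw Vulkan calls stay `unsafe`: ash adds convenience, not opinions.
    let instance = unsafe { entry.create_instance(&create_info, None)? };

    // From here on: pick a physical device, create a logical device,
    // queues, a swapchain... hence the abstractions discussed below.
    unsafe { instance.destroy_instance(None) };
    Ok(())
}
```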

Interfacing with Vulkan is infamously verbose, requiring hundreds of lines of code to get even a single triangle to show. This means that you have to build abstractions on top, to make sure developers can be productive. That’s also where tradeoffs and specialization enter the picture.

OK, giant reflective puddles are possible too

Renderer structure

The rendering needs for our creative platform are pretty straightforward. With relatively tame scenes, we’re able to push geometry to the GPU with as little as basic frustum culling. Thanks to an uncomplicated material system, objects can be rendered efficiently with almost no state shuffling in between.

However, there’s no such thing as a free lunch, of course. Simple scenes and simple materials imply a lot of pressure on lighting to deliver great-looking images. That’s where we innovate, and where we concentrate our efforts to manage complexity.

We’ve structured our renderer around a code-driven render graph, where a node usually corresponds to a single graphics, compute, or ray tracing pass. The graph is heavily specialized to the characteristics of deferred lighting techniques and post-processing, providing an extremely simple interface for setting up data flow, creating temporary resources, and communicating information across frames.

Let’s take motion blur as an example. The typical algorithm starts by dilating and reducing the resolution of per-pixel velocity vectors. In order to do that, we must first allocate a temporary texture that will hold the result, and then run a compute shader on the GPU. With our render graph, the CPU-side setup can be as simple as the code below:

Render graph usage in a motion blur algorithm
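In kajiya-style code, that setup looks roughly like the sketch below. The `SimpleRenderPass` helper and handle types approximate the real render graph API; treat the exact names and signatures as illustrative rather than verbatim engine code:

```rust
// Sketch of the first motion blur pass: dilate and reduce per-pixel
// velocity into a half-resolution texture. (Illustrative API.)
fn velocity_reduce(
    rg: &mut RenderGraph,
    velocity: &Handle<Image>,
) -> Handle<Image> {
    // The graph owns temporary resources; this texture can be aliased
    // and reused by later passes once nothing reads from it anymore.
    let mut reduced = rg.create(velocity.desc().half_res());

    SimpleRenderPass::new_compute(
        rg.add_pass("velocity reduce x"),
        "/shaders/motion_blur/velocity_reduce_x.hlsl",
    )
    .read(velocity)
    .write(&mut reduced)
    .dispatch(reduced.desc().extent);

    reduced
}
```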

The code for the entire algorithm is not much more complicated than this: one more compute pass performs vertical velocity reduction, and then a third one does the blurring. The graph’s compiler takes care of resource allocation and reuse, manages the Vulkan details, and automatically provides profiling and debugging capabilities.

With our minds freed from the low-level details of render pass plumbing, we can instead focus on the GPU side of things. That means shaders.

Shaders

Thanks to our rust-gpu project, we’re able to use Rust not only on the CPU, but also on the GPU. Traditionally, GPU code is written in simplified shading (or compute) languages. While these have their advantages, the sophistication of modern GPU code is increasing to the point where advanced language features begin to matter.

We do use some HLSL too: it compiles faster and has a fairly mature backend, so it can be a better choice for quick prototyping. On the other hand, those advantages don’t universally outweigh the benefits of using a full programming language, so our two renderers take different stances. The experimental renderer uses a mixture: Rust for the stable bits, and HLSL for the code being actively worked on. The production renderer values stability, correctness, and code sharing, and thus opts for rust-gpu nearly everywhere.

Using Rust for shaders has been a boon to us in many ways. For example, we can share functions and structures in a type-safe manner between CPU and GPU code via regular modules and crates. We’ve also had several cases where an expensive computation in a shader turned out to be constant for an entire render pass, and we could simply execute it on the CPU without needing to rewrite or port anything.
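For instance, a single crate can define the data layout both sides agree on. The sketch below is hypothetical rather than lifted from our codebase; the point is that it’s ordinary Rust, compiled once for the CPU and once to SPIR-V:

```rust
// Hypothetical `shared` crate, depended on by both the CPU renderer
// and the rust-gpu shader crate. For the GPU target it is built as
// no_std, with glam's `libm` feature providing the math functions.
use glam::{Mat4, Vec4};

#[repr(C)]
#[derive(Copy, Clone)]
pub struct FrameConstants {
    pub world_to_clip: Mat4,
    pub sun_direction: Vec4,
    pub time_seconds: f32,
    // Explicit padding keeps the CPU and GPU layouts identical.
    pub pad: [f32; 3],
}
```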

Unit testing a rust-gpu shader

Another thing we’ve been able to do is unit-test our shaders on the CPU. Typically, testing GPU code requires a complicated setup, with real graphics cards or emulation hooked up to test machines. With shaders being regular Rust code, we can verify them as part of our usual continuous integration process.
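A minimal example of the pattern; the tonemapping function here is hypothetical, but the mechanics are just ordinary Rust and `cargo test`:

```rust
use glam::Vec3;

/// Tonemapping curve used by a (hypothetical) post-processing shader.
/// Because it is ordinary Rust, it compiles both to SPIR-V and for
/// the host, so tests run on the CPU in regular CI.
pub fn reinhard(color: Vec3) -> Vec3 {
    color / (Vec3::ONE + color)
}

#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn reinhard_stays_in_unit_range() {
        let out = reinhard(Vec3::new(0.5, 4.0, 100.0));
        assert!(out.max_element() < 1.0);
        assert!(out.min_element() >= 0.0);
    }
}
```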

We’re proud of what we’ve accomplished with rust-gpu so far, and that we’ve been able to use it as the base of our rendering stack — especially considering that the GPU compiler backend has mostly been the work of just two people on our team: Ashley Hauck and eddyb.

There’s still a lot of work ahead to build a mature compiler, and to further improve Rust as a language and ecosystem for GPU programming. As it happens, we’re looking for additional compiler engineers and open source engineers to join the team and help!

Ruins environment rendered in kajiya. Scene by Crebotoly

The future

We hope that our efforts will shine a positive light on Rust as a viable platform for graphics, on the CPU and GPU alike. We envision a future where Rust’s accessibility, its fantastic community, and spirit of sharing make it a popular target for graphics and games, with a rich ecosystem of crates upon which to build.

With CPU-side utilities like render graphs and backends, it could become straightforward to start assembling a custom renderer. With the addition of GPU-side crates for common math, shading models, and individual effects, game developers could craft bespoke solutions out of building blocks with the same ease as writing system software in Rust.

We plan to share more about our work and progress towards this over the next year, and we are looking for opportunities to collaborate with other companies and developers. Feel free to reach out to us on our developer Discord or [email protected].

Finally, I would like to extend my thanks for help with composing this article to the following people: Anastasia Opara, Benjamin Bouvier, Diego Goberna, Dirk de la Hunt, Doug Church, Gray Olson, Henrik Rydgård, Johan Andersson, Maik Klein, Matthew Mannella, Sven Grundberg, and Viktor Zoutman.

