
Iconic Doom 3 Game Now in Browsers With WebAssembly: Q&A With Gabriel Cuvillier


Gabriel Cuvillier, Senior Software Engineer at Continuation Labs, ported the iconic Doom 3 game to browsers with WebAssembly. The 7-week full-time effort illustrates both the present performance potential of WebAssembly and the pieces still missing for it to seamlessly run heavyweight desktop applications and games. InfoQ interviewed Cuvillier about the technical challenges he encountered and the lessons for developers thinking about porting desktop applications with WebAssembly.

Doom 3 is a horror first-person shooter video game originally released for Microsoft Windows in 2004. It uses the id Tech 4 game engine, which was released under the GNU General Public License in 2011. The game was a critical and commercial success, with more than 3.5 million copies sold.


InfoQ: What drove you to port DOOM3 to browsers with WebAssembly?

Since the general availability of the WebAssembly MVP in major browsers two years ago, I have had the feeling that a hype cycle has started around the technology: a lot of praise, nice presentations and talks everywhere, and so on. But in practical terms, apart from some small benchmarks and sample demonstrations, there have been very few real-world use cases publicly studied and shown.

So, in order to convince myself that WebAssembly could fulfill its promises, I decided to move things to the next level and port a real program.

I found the Doom 3 game to be an ideal candidate for this: it is a large, real-world C++ program, a formerly successful AAA video game with open-sourced code (known to be of very good quality), and at the time the game was released - back in 2004 - it was really bleeding-edge technology in terms of game engine and graphics, known to bring a lot of desktop systems to their knees.

Additionally, the game has a very nice characteristic that was a critical point in my decision to focus on this game specifically: the id Tech 4 engine is probably one of the most advanced game engines that can run on a single thread of execution. Or, to put it differently, while nowadays most engines are designed for multi-core CPU systems, Doom 3 is one of the last "high-end" games designed to run on a single CPU core. As multithreading is not yet ready on the Web (mostly due to the Spectre/Meltdown security vulnerabilities), this single-threaded characteristic was a mandatory requirement from the very beginning of my project.

So, by the end of 2018, I decided to port Doom 3 to the Web, and ultimately convince myself that WebAssembly is a technology to seriously consider for the next 10 years.

InfoQ: You mentioned that D3 is a very demanding game to run in a browser. What makes it so? What are the performance drivers which are present in native desktop operating systems and missing today in desktop browsers?

Doom 3 is a very demanding game in the browser because 1) it is already a very demanding game by itself, and 2) it has to run with additional constraints in the browser that are not present in native builds.

Concerning 1), as I said in the previous answer, it is a very demanding game by itself because it provides a good level of graphics quality while running on a single thread of execution. This means that the amount of work that has to be computed sequentially while trying to stay at 16 ms per frame is simply massive: a unified lighting model with dynamic per-pixel lighting and real-time shadows, skeletal animation, rigid-body physics, AI, networking, and so on.

In that context, and concerning 2), the first constraint is that WebAssembly is first and foremost a low-level virtual machine, so in the end you have bytecode interpreted by a dedicated runtime. This can't match the performance of a general-purpose CPU executing its own native instruction set! Without a just-in-time compiler, the performance hit is at least 2x (that's an empirical number of course, based on my personal observations, but maybe there are more accurate academic studies on this topic).

The second constraint is trickier to understand: in the context of the Web, and from the point of view of WebAssembly, everything happening in the "outside world" is more or less tied to... JavaScript, and ultimately to the browser and its secure sandbox. The "outside world" includes the graphics and audio APIs. So, when the Wasm code calls a graphics API, it will not directly call your graphics card driver as it would in a native build. Instead, a small JavaScript layer is involved each time to call the corresponding "Web API", and then the browser transforms/forwards the Web API call into a graphics driver call, after having done some checks and validations. Escaping the "secure sandbox" comes at a cost.
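
As a rough illustration of that call path, here is a hypothetical C++ snippet (not actual Doom 3 code): built with Emscripten, each gl* call below is a WebAssembly import whose implementation lives in JavaScript glue code, which in turn calls the browser's WebGL API after its checks and validations.

```cpp
// Hypothetical render snippet (not taken from the Doom 3 port) showing the
// indirection described above. Compiled with Emscripten, each gl* call is a
// WebAssembly import implemented in the JavaScript GL glue, which forwards it
// to the browser's WebGL context after the sandbox's validations.
#include <GLES2/gl2.h>

void draw_mesh(GLuint program, GLuint vertex_buffer,
               GLuint index_buffer, GLsizei index_count) {
    glUseProgram(program);                          // Wasm -> JS glue -> WebGL useProgram()
    glBindBuffer(GL_ARRAY_BUFFER, vertex_buffer);   // Wasm -> JS glue -> WebGL bindBuffer()
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, index_buffer);
    glDrawElements(GL_TRIANGLES, index_count,       // Wasm -> JS glue -> WebGL drawElements()
                   GL_UNSIGNED_SHORT, nullptr);
}
```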

And all of this introduces a lot of overhead in the end. So if you have a program that heavily stresses both the CPU and the "outside world" (such as making many graphics API calls per frame), you end up in a very demanding situation.

InfoQ: Are you aware of other desktop games which have been ported to the browser with WebAssembly? Is the performance profile similar to the one you achieved with DOOM3?

To my knowledge, as of 2019, no other commercial game as demanding as Doom 3 has been ported to the Web. There are some nice tech demos, but no full AAA games. Note that I might be wrong, I did not check the whole Web! But - shameless plug - another nice porting experiment of a full commercial video game is simply the previous game I ported to WebAssembly: Arx Fatalis. This is a 2002 video game, so quite a bit uglier than Doom 3, but nevertheless very interesting. You can test the demo here: http://wasm.continuation-labs.com/arxdemo/

The story behind this port dates back to earlier experiments I did running native applications with a technology called Portable Native Client (PNaCl), which is more or less one of the precursors of WebAssembly. It worked nicely, but after Google decided to deprecate the technology in favor of WebAssembly, I decided to migrate the port to WebAssembly too, as a way to learn the new technology.

InfoQ: Among the future features coming to WebAssembly in all modern browsers (SIMD support, Dynamic Linking, 64-bit addressing, OffscreenCanvas): pick one!

The most important thing to me right now for WebAssembly is not on the list :)

It is the ability to suspend/resume the WebAssembly runtime.

The reason is difficult to explain in a short interview, but basically, you have all your typical synchronous C/C++ code embedded in a fully asynchronous environment. And that frequently does not blend well: things that are usually simple can become difficult to achieve, such as synchronous I/O operations.
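
Before even getting to I/O, the most familiar illustration of this mismatch is the game's main loop itself. The sketch below is hypothetical (not Doom 3 code) and assumes an Emscripten build: the native blocking while-loop has to be handed over to the browser's event loop, for example via emscripten_set_main_loop.

```cpp
// Minimal sketch of the main-loop restructuring typically needed on the Web.
// Native code blocks in its own loop; on the Web, control must be returned to
// the browser between frames, so the loop body becomes a callback.
#ifdef __EMSCRIPTEN__
#include <emscripten.h>
#endif

static void frame() {
    // update the game state and render one frame here
}

int main() {
#ifdef __EMSCRIPTEN__
    // Hand the loop over to the browser; fps = 0 lets it use requestAnimationFrame.
    emscripten_set_main_loop(frame, 0, /* simulate_infinite_loop = */ 1);
#else
    for (;;) {
        frame();  // the classic blocking loop of a native build
    }
#endif
    return 0;
}
```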

For example, one has to realize that "synchronously reading a big file from persistent storage" (such as a game asset from disk) is simply no longer possible on the Web. This very simple thing has been taken for granted since the dawn of programming, and now it is gone! Honestly, that hurts.

Well, you do have some ways to handle the issue, but all of them come at a cost: either you artificially "block everything" (including the browser tab), which is not good practice on the Web, or you rely on costly workarounds such as storing the whole filesystem/assets in RAM (such a waste given the size of our persistent storage solutions), or using specific compiler flags with non-trivial requirements on the code and impacts on the compiled binary.
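
To make the "keep everything in RAM" workaround concrete, here is a minimal hypothetical sketch (the file name and build command are illustrative, not taken from the Doom 3 port): with Emscripten, assets can be packaged at build time into an in-memory filesystem, for instance with the --preload-file option, so that ordinary synchronous stdio calls keep working, at the cost of holding all the data in memory.

```cpp
// Sketch of a synchronous asset read against Emscripten's in-memory filesystem.
// The asset is packaged at build time, for example:
//   emcc game.cpp --preload-file assets/pak000.pk4 -o game.html
// so fopen/fread below read from RAM rather than from real persistent storage.
#include <cstdio>
#include <cstdlib>
#include <vector>

std::vector<unsigned char> load_asset(const char* path) {
    std::FILE* f = std::fopen(path, "rb");
    if (!f) {
        std::fprintf(stderr, "missing asset: %s\n", path);
        std::abort();
    }
    std::fseek(f, 0, SEEK_END);
    long size = std::ftell(f);
    std::fseek(f, 0, SEEK_SET);
    std::vector<unsigned char> data(static_cast<std::size_t>(size));
    std::fread(data.data(), 1, data.size(), f);
    std::fclose(f);
    return data;  // the "synchronous read" works, but the whole asset set lives in RAM
}

int main() {
    auto pak = load_asset("assets/pak000.pk4");  // hypothetical pak file name
    std::printf("loaded %zu bytes\n", pak.size());
    return 0;
}
```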

Of course, rewriting the synchronous code to asynchronous code is the best solution, but with huge code bases such as Doom 3, that’s simply not possible in a reasonable time.

So, being able to suspend/resume the WebAssembly runtime would allow us to re-introduce some of the synchronous I/O paradigms we have been used to for 25+ years (at least, "synchronous" from the point of view of the C/C++ code).

Fortunately, the people behind the Emscripten project have worked on various compiler-side solutions that could address this problem, with funny names such as "Asyncify" and "Emterpreter". I used them a bit in the Doom 3 port to ease the porting process. The next iteration will probably be called "Bysyncify", and I can't wait to test it.
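
To give a flavor of the compiler-flag route mentioned above, here is a hedged sketch (flag spellings have varied between Emscripten releases): an Asyncify-style build instruments the generated code so that a call like emscripten_sleep can unwind the WebAssembly stack, yield to the browser, and resume later, which lets C/C++ code keep its synchronous look on top of an asynchronous platform.

```cpp
// Sketch of synchronous-looking code that only works when the binary is built
// with an Asyncify-style transform (e.g. "emcc -s ASYNCIFY ..." on recent
// Emscripten; older toolchains used the Emterpreter for the same purpose).
// Without it, a busy wait like this would block the browser tab's event loop.
#include <emscripten.h>
#include <cstdio>

void wait_for_assets_ms(int total_ms) {
    for (int waited = 0; waited < total_ms; waited += 100) {
        // Unwinds the Wasm stack, yields to the browser, and resumes here later.
        emscripten_sleep(100);
    }
}

int main() {
    std::puts("waiting for assets...");
    wait_for_assets_ms(500);  // looks synchronous from the C++ point of view
    std::puts("done");
    return 0;
}
```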

Well, if I really had to pick one item from your list, I'd take OffscreenCanvas. It is not really about WebAssembly itself, but more about the browser implementations, which do not yet all support this feature (Safari in particular). But the truth is that this "pick" is related to the previous point: OffscreenCanvas allows the main WebAssembly code of a graphical application to run in a Web Worker, and in Web Workers the synchronous/asynchronous issues are far less relevant.
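
As a rough illustration of that setup, here is a minimal, hedged sketch using Emscripten's HTML5 API to create the WebGL context; assuming the toolchain and browser support OffscreenCanvas (the relevant build settings, such as OFFSCREENCANVAS_SUPPORT, vary between Emscripten versions), the same code can render from inside a Web Worker instead of the main thread.

```cpp
// Hedged sketch: obtaining a WebGL context through Emscripten's HTML5 API.
// When main() runs in a Web Worker and OffscreenCanvas support is enabled in
// the build, the "#canvas" target below resolves to an OffscreenCanvas rather
// than a DOM canvas, and rendering no longer ties up the main thread.
#include <emscripten/html5.h>

EMSCRIPTEN_WEBGL_CONTEXT_HANDLE create_context() {
    EmscriptenWebGLContextAttributes attrs;
    emscripten_webgl_init_context_attributes(&attrs);  // fill with defaults
    attrs.depth = EM_TRUE;                              // the renderer needs a depth buffer

    EMSCRIPTEN_WEBGL_CONTEXT_HANDLE ctx =
        emscripten_webgl_create_context("#canvas", &attrs);
    emscripten_webgl_make_context_current(ctx);
    return ctx;
}
```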

InfoQ: You spent 6-7 weeks full-time working on the port, and described the task as fairly complex. In your opinion, should developers start experimenting with porting major applications to the browser or is it wiser to wait for the technology to mature?

I find the technology to be very mature in itself, and I am quite impressed by it. For sure, if one needs a CPU-intensive application available on the Web, there is absolutely no reason not to start working on it using WebAssembly. It could be video games, CAD software, or Big Data, for example. I am currently working on a long-term project to illustrate the use of WebAssembly in 3D CAD software, and I recently did some contract work on an upcoming product related to the last example. While I can't disclose more for now, be sure 2020 will have a bunch of nice Wasm-related surprises :)

Additionally, the people working on the WebAssembly ecosystem - be it on the compilers, Wasm runtimes, or compilation environments - are very skilled in their respective fields. You can expect quite a good software platform in the future.

A word of caution though: the Web with WebAssembly is a difficult-to-learn technology stack, with rough edges, and still quite a fast-moving target. This last point may explain why it is still difficult to find an up-to-date, in-depth book about the platform. A lot can actually be done, but only with sufficient money/time/knowledge resources. But those who are ready to put energy into this will not be disappointed, for sure!

