Google’s Chrome 94 beta announcement mentions a pair of new web standards that could make browser-based gaming experiences even better. The soon-to-be-released WebCodecs could help make cloud gaming smoother and faster, while the experimental WebGPU could make it easier for developers of games that run in the browser to tap into your computer’s graphics power.
WebCodecs is an API designed to give developers better access to the video encoding/decoding codecs that are already bundled with your browser, the components that compress and decompress video streams. While there are already ways to get video to play in Chrome, they’re not necessarily designed for things like cloud gaming, which works best when latency is as low as possible. WebCodecs is built to avoid that overhead, making it easier to get an incoming video stream onto your screen as fast as possible, potentially with the help of hardware decoding. In theory, this should also improve performance on slower machines (which are exactly the kinds of computers where cloud gaming is most desirable anyhow).
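To give a sense of how direct that access is, here is a minimal browser-only sketch of the WebCodecs decode path. It assumes a 2D canvas context `ctx` to paint into and a `packet` of VP8-encoded bytes arriving from the network; both are placeholders, not part of the API itself.

```javascript
// Minimal WebCodecs sketch (runs only in a browser that ships the API).
// Decoded frames are handed straight to a callback, with no <video>
// element or media pipeline in between.
const decoder = new VideoDecoder({
  output: (frame) => {
    // A VideoFrame can be drawn directly to a canvas, then must be
    // released so the decoder can recycle its memory.
    ctx.drawImage(frame, 0, 0);
    frame.close();
  },
  error: (e) => console.error('decode error:', e),
});

decoder.configure({
  codec: 'vp8',
  // Ask the browser to use the GPU's hardware decoder when available.
  hardwareAcceleration: 'prefer-hardware',
});

// Each incoming network packet is wrapped in an EncodedVideoChunk
// and fed to the decoder as soon as it arrives.
decoder.decode(
  new EncodedVideoChunk({ type: 'key', timestamp: 0, data: packet })
);
```

The key point for cloud gaming is that nothing here buffers: chunks go in as they arrive and frames come out as soon as they’re decoded, which is exactly the low-latency path a game stream wants.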
The newer, more experimental WebGPU gives web developers better access to your computer’s graphics horsepower by letting them hook into your computer’s native graphics API (similar to Apple’s Metal, Microsoft’s DirectX 12, or Vulkan). In simpler terms, it makes it easier for web developers to talk to your graphics card in a language it understands, without having to go through other layers that could slow things down. It’s meant to be a next-generation successor to WebGL, which lets developers tap into the (now reasonably out of date) OpenGL framework. In the future, the tech should make it easier for developers to make graphically intense games that run in the browser, tapping into the full power of current-generation GPUs.
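For a flavor of that Metal/DirectX 12/Vulkan-style model, here is a minimal browser-only sketch that clears a canvas to black with WebGPU. It assumes a `canvas` element on the page, and note that the API surface was still shifting during the origin-trial period, so the exact names here reflect the shape that later stabilized.

```javascript
// Minimal WebGPU sketch (browser-only; experimental at the time of writing).
// The flow mirrors native APIs: get an adapter (a physical GPU), then a
// device (a logical connection to it), then record and submit commands.
const adapter = await navigator.gpu.requestAdapter();
const device = await adapter.requestDevice();

const context = canvas.getContext('webgpu');
context.configure({
  device,
  format: navigator.gpu.getPreferredCanvasFormat(),
});

// Unlike WebGL's global state machine, work is recorded into a command
// encoder and handed to the GPU queue explicitly.
const encoder = device.createCommandEncoder();
const pass = encoder.beginRenderPass({
  colorAttachments: [{
    view: context.getCurrentTexture().createView(),
    loadOp: 'clear',
    clearValue: { r: 0, g: 0, b: 0, a: 1 }, // clear to opaque black
    storeOp: 'store',
  }],
});
pass.end();
device.queue.submit([encoder.finish()]);
```

That explicit record-and-submit design is the "without going through other layers" part: the browser can translate these commands almost directly to Metal, DirectX 12, or Vulkan underneath.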
Both technologies have their place outside of gaming too. In a July 2020 talk, Google mentioned that Zoom was interested in using WebCodecs for videoconferencing, and WebGPU could be used to render 3D models in the browser or to accelerate machine learning models. It makes sense that they’d show up in Chrome, as these are all areas Google plays in, from cloud gaming with Google Stadia to its own video conferencing apps. Both pieces of tech are open standards though, developed by the W3C, and other browser makers have begun testing them as well.
Of course, we probably won’t be seeing experiences powered by WebCodecs or WebGPU for a little while. While WebCodecs is actually getting close to release (it’s expected to be turned on by default in the upcoming Chrome 94), developers will still have to make their apps work with it. As for WebGPU, it’s currently in its experimental trial phase, which Google expects to end in early 2022. Whether it’ll end up as a full feature at that point depends on how the trial goes, whether the specification is done, and whether enough people are interested in using it.
While these technologies may not make things that were impossible possible, they’re exciting nonetheless. When things are easier, or more flexible, it lowers the barrier to entry for developers. For gamers looking to play on the web, whether through streaming or native browser games, the time developers save on figuring out how to get frames onto your screen is time they can spend making other parts of the experience better.