I was just pointed to this spec by Kai Ninomiya. At Unity3D we notice time and time again that it would be great to have built-in, browser-provided support for compressing and decompressing. Our wish list:
access to gzip, deflate and brotli
ability to do both compression and decompression
ability to do streamed decompression
ability to do synchronous compression in Workers
Some of the use cases that we have had:
Unity3D employs large binary files for bundled game assets (a ".data" file blob, typically 50MB-100MB per game). This asset must be compressed to ensure fast download times, but we do not want to compress it on demand on the web server, the way other .html and .js content might be handled, so we use "pre-compressed" asset files. In this scheme we typically name the file on disk "game.data.br" or "game.data.gz" and precompress it in advance. Then we require the user to configure their web server to serve the file with Content-Encoding: gzip (or br).
This way the server's on-demand compression is not needed, which avoids taxing the server CPU and its compression caches, and users still get a compressed file over the wire.
However, it is very common among Unity3D game developers that when it comes time to host a game, they cannot configure their web servers arbitrarily. Either because they are not web admins, or because they do not have the knowledge, they are unable to add the Content-Encoding header. This means that Unity downloads the gzipped asset files directly into the engine.
We then decompress the file in Wasm, but experience shows that this is slower than what the browser's native side could do, and we lose streamed decompression because we do not have an appropriate streaming decompressor in place (this could be implemented, but we never quite had the cycles to do so).
If the browser exposed its own decompressors, we could simply feed the file to them and not have to worry about a performance loss, even when the user was unable to set up the header fields correctly.
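A minimal sketch of this use case with today's Compression Streams API, assuming a runtime that ships DecompressionStream ("game.data.gz" is a hypothetical asset URL, not a real endpoint):

```javascript
// Sketch: stream-decompress a pre-compressed asset whose server does not
// send a Content-Encoding header, using the browser's built-in codec.
async function gunzipStream(readable) {
  // Pipe the raw gzip bytes through the built-in decompressor and
  // collect the output into a single buffer.
  const decompressed = readable.pipeThrough(new DecompressionStream("gzip"));
  return new Uint8Array(await new Response(decompressed).arrayBuffer());
}

// Usage for a pre-compressed asset served as plain bytes:
//   const response = await fetch("game.data.gz");
//   const bytes = await gunzipStream(response.body);
```

Because the input is a ReadableStream, this keeps the streaming property the post asks for: decompression can start before the download finishes.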
Tiny Unity (https://unity.com/solutions/instant-games) is a piece of technology that enables executing web games with a very small shipped footprint. The small size is largely achieved by reusing the codecs a browser already ships: instead of compiling in libpng, we reuse the browser's image decoding capabilities; Web Audio decompression instead of shipping a compiled libmp3/libvorbis; browser fonts instead of shipping a compiled FreeType 2; and so on.
For compression, though, Tiny Unity does not have a solution, so we compile in zlib. It is not the largest component, but it would be nice to be able to remove it, and in particular to have the browser provide a brotli codec for the page to use. Shipping one codec with the engine is tough; having to ship multiple is even tougher, and that friction makes updates harder.
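For the gzip/deflate formats that browsers do expose today, the compiled-in zlib could already be replaced on this path; a minimal sketch, assuming a runtime with CompressionStream:

```javascript
// Sketch: gzip-compress a byte buffer with the browser's built-in codec
// instead of a zlib compiled into the Wasm module.
async function gzipBytes(bytes) {
  const compressed = new Blob([bytes]).stream()
    .pipeThrough(new CompressionStream("gzip"));
  return new Uint8Array(await new Response(compressed).arrayBuffer());
}
```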
In multithreaded, SharedArrayBuffer-based Wasm applications, we find the need to create web workers that perform computational tasks for the main thread. In our setup, we have a task queue in the Wasm SAB, and the workers keep grabbing tasks from the queue as soon as they arrive. These tasks include zip compression and decompression.
In this kind of setup, it would be very useful to be able to run a compression or decompression operation synchronously in a Worker. The Worker could then retain its call stack context in connection with the Wasm SAB work queue, rather than the program having to be structured to yield to the Worker's event loop in order to process other functions. Yielding has been observed to worsen latency in real-time rendering applications, which target interactive frame rates.
Synchronous compression and decompression would naturally not be expected to work on the main browser thread.
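For illustration, this is roughly the asynchronous shape such a worker task is forced into today; handleTask is a hypothetical handler for a gzip decompression job pulled from the queue, and the message-loop wiring in the final comment is exactly the event-loop yield described above:

```javascript
// Sketch of today's asynchronous worker-side handling of a decompression
// task (handleTask is a hypothetical name, not part of any shipped API).
async function handleTask(gzBytes) {
  const stream = new Blob([gzBytes]).stream()
    .pipeThrough(new DecompressionStream("gzip"));
  return new Uint8Array(await new Response(stream).arrayBuffer());
}

// In a real worker this must be wired through the message loop, so the
// worker cannot keep its call stack pinned to the SAB work queue:
//   self.onmessage = async (e) => self.postMessage(await handleTask(e.data));
```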
So I am curious: what is the latest status of this spec, and how would the above use cases mesh with it?
Thanks!
Thank you for the information about use cases! This is very helpful!
access to gzip, deflate and brotli
gzip and deflate are shipping in Chromium.
brotli is problematic because Chromium doesn't currently link the compression dictionary, so there would be a binary size penalty to adding it. Shipping only decompression without compression would probably be confusing to developers.
There's a bit of a chicken-and-egg problem here, in that the CompressionStream APIs don't have enough usage to justify big increases in binary size, but without access to brotli there may not be enough motivation for developers to adopt them.
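One way a page can cope with that uncertainty today is to feature-detect which formats a given browser accepts and fall back to a Wasm codec otherwise; a sketch relying on the spec's behavior of throwing a TypeError for unsupported format strings:

```javascript
// Sketch: probe which format strings this runtime's DecompressionStream
// accepts; per the spec, constructing with an unsupported format throws.
function supportsFormat(format) {
  try {
    new DecompressionStream(format);
    return true;
  } catch {
    return false;
  }
}

// e.g. supportsFormat("gzip"), supportsFormat("deflate"), supportsFormat("br")
```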
I've been resistant to doing synchronous operations because of the risk of creating a poor user experience when the data is larger than usual. Making the facility only available in Workers may be a good compromise. I filed #38 for further discussion.