Gzip will cause react-dom-stream to not stream content #5
Look at the response headers in your browser dev tools. If there's a header called `Transfer-Encoding` with the value `chunked`, the response is being streamed. If the response is not being streamed, there are two usual culprits: middleware and proxies.
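A minimal sketch (not from this thread) of checking the same thing from a Node script instead of dev tools; the URL is hypothetical and should point at your own server:

```js
// Quick check that a response is actually streaming: inspect the headers
// and watch how the body chunks arrive over time.
const http = require('http');

http.get('http://localhost:3000/some-page', (res) => {
  console.log('status:', res.statusCode);
  console.log('transfer-encoding:', res.headers['transfer-encoding']);
  console.log('content-encoding:', res.headers['content-encoding']);

  const started = Date.now();
  res.on('data', (chunk) => {
    // A streaming response delivers chunks spread out over time;
    // a buffered one delivers everything just before 'end'.
    console.log(`+${Date.now() - started}ms: ${chunk.length} bytes`);
  });
  res.on('end', () => console.log('done'));
});
```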
This is awkward. I don't see any `Transfer-Encoding: chunked` header.
Renamed the issue title so others may find this useful.
Did you use `compression`? There are a lot of bug reports of people having a hard time getting `compression` to stream correctly, but its continuous integration tests show that streaming works with it. I've been poking at it for the last hour or so and have been having a hard time getting it to do the right thing; it could be my code's fault. I'll reopen this as a tracking bug to make it work.
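For what it's worth, `compression` buffers output by default but documents a `res.flush()` method for pushing out what has been compressed so far. A minimal sketch of using it in a streaming handler (the route and markup are made up, not the code from this issue):

```js
const express = require('express');
const compression = require('compression');

const app = express();
app.use(compression());

app.get('/stream-test', (req, res) => {
  res.set('Content-Type', 'text/html');
  res.write('<html><head><link rel="stylesheet" href="/main.css"></head><body>');
  // compression buffers by default; res.flush() (added by the middleware)
  // forces the partially-compressed response out to the client now.
  res.flush();

  // Simulate a slow body: the <head> above should already be in the browser.
  setTimeout(() => {
    res.write('<div>late body content</div></body></html>');
    res.end();
  }, 1000);
});

app.listen(3000);
```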
Yes, I use `compression`. Regardless, my server is fronted by a load balancer which will also gzip content. If either the load balancer or `compression` gzips the response, it sounds like chunked encoding may not survive, per this: http://stackoverflow.com/questions/5280633/gzip-compression-of-chunked-encoding-response
Right, but that quote doesn't mean that chunked encoding and gzip are incompatible; it says the opposite. They are compatible and can be used together, and the folks behind `compression` test for exactly that. Of course, if you have a load balancer that doesn't support streaming or that doesn't allow its gzip to be tuned, that's a different issue.
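To make the compatibility point concrete, here is a minimal sketch using plain Node `http` and `zlib` (no middleware); the markup and timings are invented:

```js
const http = require('http');
const zlib = require('zlib');

http.createServer((req, res) => {
  res.writeHead(200, {
    'Content-Type': 'text/html',
    'Content-Encoding': 'gzip',
    // No Content-Length, so Node falls back to chunked transfer encoding.
  });

  // Z_SYNC_FLUSH tells zlib to emit compressed bytes after each write
  // instead of waiting for its internal buffer to fill.
  const gzip = zlib.createGzip({ flush: zlib.constants.Z_SYNC_FLUSH });
  gzip.pipe(res);

  gzip.write('<html><head><link rel="stylesheet" href="/main.css"></head><body>');
  setTimeout(() => {
    gzip.end('<p>The head was delivered gzipped and chunked, before the body.</p></body></html>');
  }, 1000);
}).listen(3000);
```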
@aickin @geekyme Try to set the content type of the response via `res.setHeader('Content-Type', 'text/html')` before writing any output.
It solved my problem. P.S. @aickin Thanks a LOT for a GREAT module!
Oh ok, I was using `res.writeHead` to set the content type to text/html.
@th0r I just tried that, but I'm still not seeing the content streamed.
@geekyme Are your JS or CSS responses gzipped?
All responses from my server are gzipped.
FWIW, at Redfin we had an issue with the default `windowBits` value used by the `compression` middleware: too much output was buffered before anything reached the browser. Code snippet:

```js
// The default value for `windowBits` is 15 (32K). This is too
// large for our CSS/script includes to make it through before
// we start waiting for the body. We _really_ want to kick off
// secondary resource requests as early as possible, so we'll
// decrease the window size to 8K.
server.use(require('compression')({ windowBits: 13 }));
```

Not sure if that's contributing here, but these issues sound similar.
Interesting! That's something I could definitely try!
Great thread guys, thanks! Very helpful.
Another zlib option was mentioned by @jakearchibald in the discussion of issue #2: https://github.com/jakearchibald/offline-wikipedia/blob/master/index.js#L64
Lowering the window bits will hurt compression, since it limits the range over which back-references can operate.
May be slightly off-topic, but I'd generally off-load compression to the upstream proxy (e.g. nginx or equivalent). Not only is it generally faster than Node-based compression, but you can get granular about enabling it per route and disabling it based on MIME types or other headers that the Node app might send back.
@leebenson agreed, that's often a great setup (as long as your upstream proxy supports streaming compression, which you need to test!).
Worth noting, though, that I think `compression` gives you similar granularity through its `filter` option. I still think you're right that an upstream proxy is usually a better choice, though.
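A minimal sketch of that kind of per-route / per-type control on the Node side, using the `compression` module's documented `filter` option (the route prefix is hypothetical):

```js
const express = require('express');
const compression = require('compression');

const app = express();

app.use(compression({
  filter: (req, res) => {
    // Leave streaming routes uncompressed so nothing buffers them.
    if (req.path.startsWith('/stream/')) return false;
    // Otherwise defer to the module's default content-type based filter.
    return compression.filter(req, res);
  },
}));
```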
@aickin totally. Feature-wise, Node is pretty much on par. Hard to beat the speed of a proxy server written in C, though.
Tried this out on a test site. I don't see the content being streamed down properly, because if it were, you would see main-xxx.css start downloading before for-him/ finishes downloading.
Am I missing some kind of proper encoding?