
Gzip will cause react-dom-stream to not stream content #5

Open
geekyme opened this issue Oct 22, 2015 · 20 comments

Comments

@geekyme

geekyme commented Oct 22, 2015

[screenshot]

Tried this out on a test site. I don't see the content being streamed down properly, because if it were, you would see main-xxx.css start downloading before for-him/ finishes downloading.

Am I missing some kind of proper encoding?

@aickin
Owner

aickin commented Oct 22, 2015

Look at the response headers in your browser dev tools. If there's a header called Content-Length, then it's not being streamed. If there's a header called Transfer-Encoding with a value of chunked, then it is being streamed.

If the response is not being streamed, there are two usual culprits: middleware and proxies.

  • Middleware, particularly things like gzip, can potentially buffer the responses. I think most middleware can be tuned to buffer less, but it depends on which middleware you are using.
  • Proxies or CDNs are allowed to turn streaming results into non-streaming, but that's not super common these days in my experience.

@geekyme
Author

geekyme commented Oct 22, 2015

This is awkward. I don't see any Content-Length header, and there is a Transfer-Encoding header with a value of chunked.

@geekyme
Author

geekyme commented Oct 22, 2015

You are right. It's gzip that is causing streaming to not work.

[screenshot]

Removing the gzip middleware will fix the issue. It's too bad though, gzip is useful. :(

@geekyme geekyme closed this as completed Oct 22, 2015
@geekyme
Author

geekyme commented Oct 22, 2015

Renamed the issue title so others may find this useful

@geekyme geekyme changed the title Doesn't appear to be streaming the content properly Gzip will cause react-dom-stream to not stream content Oct 22, 2015
@aickin
Owner

aickin commented Oct 22, 2015

Did you use compression?

There are a lot of bug reports of people having a hard time getting compression to stream correctly, but their continuous integration tests show that streaming works with it. I've been poking at it for the last hour or so and have been having a hard time getting it to do the right thing; it could be my code's fault. I'll reopen this as a tracking bug to make it work.

@aickin aickin reopened this Oct 22, 2015
@geekyme
Author

geekyme commented Oct 22, 2015

Yes, I use compression.

Regardless, my server is fronted by a load balancer which will also gzip content. If either the load balancer or compression is running gzip, then my streaming will not work.

http://stackoverflow.com/questions/5280633/gzip-compression-of-chunked-encoding-response

You gzip the content, and only then apply the chunked encoding:

"Since "chunked" is the only transfer-coding required to be understood by HTTP/1.1 recipients, it plays a crucial role in delimiting messages on a persistent connection. Whenever a transfer-coding is applied to a payload body in a request, the final transfer-coding applied MUST be "chunked". If a transfer-coding is applied to a response payload body, then either the final transfer-coding applied MUST be "chunked" or the message MUST be terminated by closing the connection. When the "chunked" transfer-coding is used, it MUST be the last transfer-coding applied to form the message-body. The "chunked" transfer-coding MUST NOT be applied more than once in a message-body."

@aickin
Owner

aickin commented Oct 22, 2015

Right, but that quote doesn't mean that chunked encoding and gzip are incompatible; it says the opposite. They are compatible and can be used together, and the folks behind compression document it as intended to be compatible with chunked encoding. I think I need to add a few calls to flush, and maybe some guidance on how big a buffer to use in compression.

Of course, if you have a load balancer that doesn't support streaming or that doesn't allow its gzip to be tuned, that's a different issue.

@th0r

th0r commented Oct 22, 2015

@aickin compression is working for me: the response is chunked and gzipped.
I think you don't need to add flush calls, because it's the user's responsibility to do so if they really need it.

@geekyme Try to set content type of the response via res.type('html') before calling any res.write().
From compression docs:

The default filter function uses the compressible module to determine if res.getHeader('Content-Type') is compressible.

It solved my problem.

P.S. @aickin Thanks a LOT for a GREAT module!

@geekyme
Author

geekyme commented Oct 22, 2015

Oh OK, I was using res.writeHead to set the content type to text/html.


@geekyme
Author

geekyme commented Oct 22, 2015

@th0r I just tried res.type('html') with compression on. Doesn't make a difference.

@th0r

th0r commented Oct 22, 2015

@geekyme Are your js- or css- responses gzipped?
How do your response headers look in the case of chunked response?

@geekyme
Author

geekyme commented Oct 22, 2015

All my responses from my server are gzipped.

@roblg

roblg commented Oct 23, 2015

FWIW, at Redfin we had an issue with the default windowBits setting for the compression module that affected our ability to stream in staging and production environments. Even though we thought we were streaming, the compression was doing some internal buffering of the response before writing it (I think to try to maximize compression?). In any event, we lowered the windowBits setting and started getting content much earlier.

Code snippet:

        // The default value for `windowBits` is 15 (32K).  This is too
        // large for our CSS/script includes to make it through before
        // we start waiting for the body.  We _really_ want to kick off
        // secondary resource requests as early as possible, so we'll
        // decrease the window size to 8K.
        //
        server.use(require('compression')({ windowBits: 13 }));

Not sure if that's contributing here, but these issues sound similar.

@geekyme
Author

geekyme commented Oct 23, 2015

Interesting! That's something I could definitely try!


@SOSANA

SOSANA commented Oct 23, 2015

Great thread, guys, thanks! Very helpful.

@aickin
Owner

aickin commented Oct 23, 2015

Another zlib option was mentioned by @jakearchibald in the discussion of issue #2: https://github.com/jakearchibald/offline-wikipedia/blob/master/index.js#L64

@jakearchibald

Lowering the window bits will harm compression, as it limits the range over which back-references can operate.

@leebenson

May be slightly off-topic, but I'd generally offload compression to the upstream proxy (e.g. nginx or equivalent). Not only is it generally faster than Node-based compression, but you can also get granular, enabling it per-route and disabling it based on MIME types or other headers that the Node app might send back.

@aickin
Owner

aickin commented Oct 28, 2015

@leebenson agreed that's often a great setup (as long as your upstream proxy supports streaming compression, which you need to test!).

you can get granular with enabling it per-route and disabling based on MIME types

Worth noting, though, that I think compression can do this. You implement a filter method in the options you pass to compression; it is called for every request, receives the request and response objects, and returns true iff the response should be compressed.

I still think you're right that an upstream proxy is usually a better choice, though.
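A sketch of that filter option, written as a plain function so it can stand alone. The shouldCompress name and the x-no-compress header are illustrative choices, not part of compression's API:

```javascript
// Hedged sketch: the logic of a per-request compression filter.
function shouldCompress(req) {
  // Honor an explicit opt-out header from the client or an upstream proxy.
  if (req.headers['x-no-compress']) return false;
  return true;
}

// Usage (assuming express and compression are installed):
// app.use(compression({
//   filter: (req, res) => shouldCompress(req) && compression.filter(req, res),
// }));

console.log(shouldCompress({ headers: {} }));                       // true
console.log(shouldCompress({ headers: { 'x-no-compress': '1' } })); // false
```

Falling back to compression.filter keeps the default Content-Type check for requests the custom logic doesn't veto.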

@leebenson

@aickin totally. Feature-wise, Node is pretty much on par. Hard to beat the speed of a proxy server written in C, though.


7 participants