fix(fetch): support `Content-Encoding` response header #604
Conversation
Thanks, @mikicho! I believe the answers to both of your questions are in setting up an HTTP server and taking a look at what happens to compressed bodies sent from the server. We can request the same endpoint in the browser and in Node.js, and see whether they handle that at all. But overall, yes, Interceptors ships browser interceptors, so we can't run things like `zlib` there.
```js
const compressed = zlib.brotliCompressSync(zlib.gzipSync(message))

// …

interceptor.once('request', ({ controller }) => {
  controller.respondWith(new Response(compressed, {
```
For mocked responses, you most definitely need a third-party. Interceptors mustn't meddle with your input based on any headers.
I'm mostly curious about receiving a compressed request in Node.js.
> I'm mostly curious about receiving a compressed request in Node.js.
I hope I understand your question correctly, but:

- Fetch: decompresses the response according to the `Content-Encoding` header.
- `IncomingMessage`: returns the body as-is (see the sketch below).
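For illustration, here's a minimal sketch (mine, not from this PR) of what consuming a gzipped response looks like with the bare `http` module, where decompression is entirely the caller's job; the URL and header values are assumptions:

```js
import http from 'http'
import zlib from 'zlib'

http.get('http://localhost:3000/resource', (res) => {
  // `IncomingMessage` exposes the body exactly as sent over the wire,
  // so a gzipped response has to be piped through zlib manually.
  const body =
    res.headers['content-encoding'] === 'gzip'
      ? res.pipe(zlib.createGunzip())
      : res

  const chunks = []
  body.on('data', (chunk) => chunks.push(chunk))
  body.on('end', () => {
    console.log(Buffer.concat(chunks).toString('utf8'))
  })
})
```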
Got it! That seems to be the case, although we may want to add integration tests for the `ClientRequest` interceptor to verify it. I do believe it's going to expose the response body as-is, and it's up to the developer to decompress it. The fetch client has decompression built-in; I can confirm it by looking at the source of Undici.
Some related stuff I found: […]
Gave this a try in Node.js:

```js
import fs from 'fs'
import http from 'http'
import zlib from 'zlib'

const stream = fs.createReadStream('./payload.txt')

const server = http.createServer((req, res) => {
  // Compress the response body.
  if (req.method === 'GET' && req.url === '/resource') {
    res.writeHead(200, {
      'content-type': 'text/plain',
      'content-encoding': 'gzip',
    })
    stream.pipe(zlib.createGzip()).pipe(res)
    return
  }

  res.statusCode = 404
  res.end()
})

server.listen(3000, async () => {
  console.log('Server is running on http://localhost:3000')

  const response = await fetch('http://localhost:3000/resource', {
    headers: { 'Accept-Encoding': 'gzip' },
  })
  console.log(await response.text())
})
```

I receive the file's content decoded. I don't do the compression though.
What do you mean? You do compress the response on the server.
@kettanaito I added an unmocked test, which passes.
Thank you, @mikicho! Will look at this.
Force-pushed from ad6cf89 to 62b4f09.
We do need to add support for the `Content-Encoding` response header. Ideally, we'd reuse Undici's own decompression handling for that. If not, we can integrate their handling into Interceptors manually, although that'd be a repetition I'd rather not introduce.
**Update**: I realized we still need to duplicate Undici's decompression handling because it's unlikely to be backported to Node v18, and even v20, at this point. I certainly don't have the capacity to see those backports through. The handling itself is rather straightforward; it's just nice to be consistent. We will afford that nicety once Node.js v22 becomes the minimal supported version across MSW's toolchain.

@mikicho, if you have time, can you give me a hand with this? I've already abstracted the decompression logic in the Undici PR, making it work better with […]. We basically need this.
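For context, duplicating that handling in Node.js could look roughly like this hypothetical helper (the name and exact logic are assumptions, not the PR's actual code):

```js
import zlib from 'zlib'

// Hypothetical helper: builds the list of zlib decompression streams
// for a given `Content-Encoding` header value.
function createDecompressionStreams(contentEncoding) {
  const codings = contentEncoding
    .toLowerCase()
    .split(',')
    .map((coding) => coding.trim())
    .filter(Boolean)

  // Encodings are applied left-to-right when compressing,
  // so they must be undone in reverse order.
  return codings.reverse().map((coding) => {
    switch (coding) {
      case 'gzip':
      case 'x-gzip':
        return zlib.createGunzip()
      case 'deflate':
        return zlib.createInflate()
      case 'br':
        return zlib.createBrotliDecompress()
      default:
        throw new Error(`Unsupported Content-Encoding: ${coding}`)
    }
  })
}
```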
We can also consider using the standard Compression Streams API instead of relying on Undici. I will explore this option.
**Update**: Opened a pull request that utilizes the Compression Streams API to decompress fetch responses: #661. This works for GZIP and Deflate, but has no Brotli support. @mikicho, would be curious to hear your thoughts on this one.

Edit: Supporting Brotli seems like a lot of work. We can either release decoding without it (just GZIP and Deflate) and work on it in the background, or block this feature until we find a way to make Brotli happen in Node.js and the browser (it is possible via […]).
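As a rough sketch of that approach (assuming a `Response` with a readable body and a single encoding; this is not the code from #661):

```js
// `DecompressionStream` supports 'gzip', 'deflate', and 'deflate-raw',
// but has no Brotli support in either Node.js or the browser.
async function readDecompressed(response) {
  const encoding = response.headers.get('content-encoding')

  if (encoding === 'gzip' || encoding === 'deflate') {
    const decompressed = response.body.pipeThrough(
      new DecompressionStream(encoding)
    )
    return new Response(decompressed).text()
  }

  return response.text()
}
```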
@kettanaito I think we should check the environment, and if running in Node.js, use `zlib`.
```diff
@@ -29,6 +35,12 @@ const browserConfig: Options = {
   format: ['cjs', 'esm'],
   sourcemap: true,
   dts: true,
+  esbuildOptions(options) {
+    options.alias = {
+      [`internal:brotli-decompress`]:
```
This is the environment-based decompression for the build.
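In other words, the alias resolves to one of two modules with the same interface depending on the build target. A sketch of the idea (file names and exports are illustrative, not the PR's actual paths):

```js
// brotli-decompress.node.js (assumed name): real Brotli via zlib.
import zlib from 'zlib'

export function createBrotliDecompress() {
  return zlib.createBrotliDecompress()
}
```

```js
// brotli-decompress.browser.js (assumed name): no Brotli in the browser,
// so fail loudly if a response ever declares `Content-Encoding: br`.
export function createBrotliDecompress() {
  throw new Error('Brotli decompression is not supported in this environment')
}
```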
```diff
@@ -14,7 +14,10 @@
     "types": ["@types/node"],
     "baseUrl": ".",
     "paths": {
-      "_http_common": ["./_http_common.d.ts"]
+      "_http_common": ["./_http_common.d.ts"],
+      "internal:brotli-decompress": [
```
This is the environment-based decompression for development (TS).
```js
alias: {
  // Create a manual alias for Vitest so it could resolve this
  // internal environment-dependent module in tests.
  'internal:brotli-decompress':
```
This is the environment-based decompression for the tests.
Looks good to me.
@mikicho, please, would you have a moment to browse through the changes? Would love to hear your thoughts too.
@kettanaito LGTM. Just curious about the `internal:brotli-decompress` alias.
@mikicho, I added the `internal:brotli-decompress` alias to resolve the environment-specific Brotli decompression module. Do you see a better approach here?
Ohh, OK! I see.
It's good to be consistent across the browser and Node.js, and the […].
@kettanaito I think there is something off with the decompression order… still looking to understand the root cause (if any).
Right now, we're applying decompression left-to-right based on the `Content-Encoding` header. Maybe that's the problem? Basically, for `Content-Encoding: gzip, br`, the decompression chain would be: input → `DecompressionStream('gzip')` → `BrotliDecompression` → output.
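A quick zlib-only demonstration of why the order matters (a sketch with an arbitrary payload):

```js
import zlib from 'zlib'

// `Content-Encoding: gzip, br` means gzip was applied first, then Brotli.
const body = zlib.brotliCompressSync(zlib.gzipSync('hello world'))

// Decompression must therefore run right-to-left: undo Brotli, then gzip.
const decoded = zlib.gunzipSync(zlib.brotliDecompressSync(body))
console.log(decoded.toString()) // 'hello world'
```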
This behavior looks correct; I'm not sure the code does it. From what I see:

```js
compressResponse(['gzip', 'br'])('hello world').toString('base64')
// H4sIAAAAAAAAA+NmbchIzcnJVyjPL8pJYQYAhavvgA8AAAA=
```

produces a string that is br → gzip, instead of gzip → br. In contrast:

```js
zlib.brotliCompressSync(zlib.gzipSync('hello world')).toString('base64')
// Cw+AH4sIAAAAAAAAA8tIzcnJVyjPL8pJAQCFEUoNCwAAAAM=
```
Your example is incorrect. Decompression must be reversed, and there's a bug currently that I will fix soon.
IIRC, […]

I saw that too lol. Seems like […]

Why not? If I compressed by […]
I think we mean different things. By `Content-Encoding: gzip, br`, the body is `br(gzip(data))`.

@mikicho, I've pushed the fix that applies the decompression right-to-left to respect the right order of multiple compressions. Mocked tests pass, but I have two bypass tests failing:
The error seems to be coming from Undici being unable to handle that compression:
Our interceptor is not involved here at all. Am I constructing an invalid compressed response with `compressResponse`? I can confirm that:

```
decompressResponse: eJyT7+ZgAAPh0x5nT54M1zivf8qTkaFV0IuXGygMAICXB8E=
zlib:               eJyT7+ZgAAPh0x5nT54M1zivf8qTkaFV0IuXGygMAICXB8E=
```
I fixed the […]. The issue is in Undici; I can reproduce it standalone as well. Will report it to their repo. Edit: Opened an issue at nodejs/undici#3762.
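For reference, a standalone reproduction might look roughly like this (an assumed sketch, not the exact snippet reported in nodejs/undici#3762):

```js
import http from 'http'
import zlib from 'zlib'

// A plain Node.js server responds with a `gzip, br` encoded body.
const server = http.createServer((req, res) => {
  res.writeHead(200, { 'content-encoding': 'gzip, br' })
  res.end(zlib.brotliCompressSync(zlib.gzipSync('hello world')))
})

server.listen(3000, async () => {
  // No interceptors involved: fetch (Undici) fails to decode this body
  // in the affected versions.
  const response = await fetch('http://localhost:3000/')
  console.log(await response.text())
  server.close()
})
```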
Since the behavior is reproducible in isolation, our side of the implementation seems to be done. The intention of bypass tests is to ensure the interceptor doesn't interfere with the regular behavior of the request client, but in this case the request client has a bug, so we cannot allow that to influence our tests. I will skip those two scenarios and link the issue above them. Will do that after we discuss the issue with the Undici folks.
This looks good to me. @mikicho, should we merge this?
🚀
**Released: v0.36.6** 🎉

This has been released in v0.36.6! Make sure to always update to the latest version (…).

Predictable release automation by @ossjs/release.
Added failing test.

@kettanaito

1. `interceptors` is cross-platform, and AFAIK `zlib` does not exist in the browser. How does the browser decompress the response body? Am I missing something?
2. `IncomingMessage` does not decompress the response body, and Node.js leaves this to the HTTP libraries (Axios, got, etc.). So, I think we should follow this behavior and leave the response body compressed.

Summary: uses `zlib` and a custom transform stream (sketched below) to support Brotli decompression in Node.js.
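A sketch of what such a transform stream could look like, bridging Node's `zlib` Brotli decompressor into a web `TransformStream` (the name is assumed; the actual implementation in the PR may differ):

```js
import zlib from 'zlib'

function createBrotliDecompressionStream() {
  const brotli = zlib.createBrotliDecompress()

  return new TransformStream({
    start(controller) {
      // Forward decompressed chunks and errors to the web stream.
      brotli.on('data', (chunk) => controller.enqueue(new Uint8Array(chunk)))
      brotli.on('error', (error) => controller.error(error))
    },
    transform(chunk) {
      brotli.write(chunk)
    },
    flush() {
      // Wait for zlib to flush any remaining output.
      return new Promise((resolve) => brotli.end(resolve))
    },
  })
}
```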