
Feature Request: Allow BlendMode::None when applying a backdrop-filter blur #4294

Open
johnoneil opened this issue Mar 10, 2021 · 6 comments


@johnoneil

I've been struggling with an issue related to rendering images with alpha channels as opaque while also applying backdrop-filter blurs. We have some images that carry alpha channel information, but we'd like to render them 100% opaque, with blurs on top, and I can't quite get it to work.
I've recreated the issue in wrench and can step through it in the debugger. I'm currently of the opinion that this functionality would have to be added, but I thought I would walk through our case and ask whether perhaps we're going about this the wrong way.
Since wrench makes sharing test cases very easy, I've written several; they are linked below.
Also, forgive me for being a bit pedantic: I don't know webrender very well and am also using this as a learning exercise.

So I'm going to walk through our use case pretty slowly. It's mostly for my benefit, as I don't know webrender very well. Any suggestions for achieving what we're looking for are appreciated.

Case 1 (no issue): Images with alpha and premultiplied alpha

I had to make some changes to wrench (which currently converts all input images to premultiplied alpha), but I just wanted to test displaying images with "normal" (straight) alpha vs. premultiplied alpha. The image on the left uses straight alpha; the image on the right uses premultiplied alpha. Both render correctly. Excellent. The yaml file I'm using for this is here.
case1 alpha yaml
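For reference, the straight-to-premultiplied conversion being tested above can be sketched in a few lines. This is a minimal, illustrative sketch and not wrench's actual code; the `premultiply` helper name is hypothetical.

```rust
// Convert one straight-alpha RGBA8 pixel to premultiplied alpha:
// each color channel is scaled by alpha/255, with rounding.
fn premultiply(rgba: [u8; 4]) -> [u8; 4] {
    let [r, g, b, a] = rgba;
    let mul = |c: u8| ((c as u16 * a as u16 + 127) / 255) as u8;
    [mul(r), mul(g), mul(b), a]
}

fn main() {
    // A half-transparent white pixel: color channels scale down with alpha.
    assert_eq!(premultiply([255, 255, 255, 128]), [128, 128, 128, 128]);
    // Fully opaque pixels are unchanged.
    assert_eq!(premultiply([10, 20, 30, 255]), [10, 20, 30, 255]);
    println!("ok");
}
```

When both representations are composited with the matching blend equation, they produce the same result, which is why Case 1 renders both images correctly.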

Case 2 (no issue): Specifying opacity

The next case takes the images from above and simply marks them as opaque (using ImageDescriptorFlags::IS_OPAQUE). I added the ability to specify opacity to wrench, as it currently infers opacity from the image content. So, regardless of image content (the presence of an alpha channel), I'd like to mark certain images as opaque. The public API makes this possible, and the results below are correct. The dark bottom of the image on the right is because the alpha was premultiplied into the color channels. The yaml file for this test is here.
case2 opaque yaml

Case 3 (no issue): Specifying a backdrop-filter blur over transparent images

The next case superimposes a blur (specifically a backdrop-filter blur) over case 1 above (images are transparent). As you'd expect, the results are correct. The yaml file for this test is here.
case3 alpha-with-blur yaml

Case 4 (issue): Specifying a backdrop-filter blur over opaque images

Now the problematic part. This case superimposes a backdrop-filter blur over images that have been tagged via ImageDescriptorFlags::IS_OPAQUE. As can be seen, the opacity is not preserved in this case: I'd expect the color bars beneath the blur to look more like Case 2 above rather than Case 1. The yaml script for this test is here.
case4 opaque with blur


Summary:

Do you think there is another way to achieve the desired effect? I've iterated quite a bit on different blend modes and stacking contexts (wrench makes this very easy), but the tagged image opacity seems to be lost when batching the complex rendering passes required for the backdrop-filter blur.
I'd also be interested in any suggestions for ways to debug this (for example, visualizing the various passes done for backdrop-filter blurs) and for getting more insight into how the batches are formed.

Appreciate any help you can provide.

@gw3583
Contributor

gw3583 commented Mar 10, 2021

Hi John,

Thanks very much for the detailed report!

I have only quickly scanned this so far, but I think there is possibly some confusion about what the semantics of IS_OPAQUE in the image descriptor are.

The intent of this flag is an optimization hint to WR that says "even though this image may have an alpha channel, I'm guaranteeing it's all 1.0, so you can treat it as opaque safely". For example, we could then draw this image in the opaque pass, taking advantage of z-rejection.
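That contract can be sketched concretely: the flag is only safe to set when every alpha byte really is 1.0. The `can_mark_opaque` helper below is illustrative, not webrender code; only the `ImageDescriptorFlags::IS_OPAQUE` name it alludes to is real.

```rust
// Sanity-check the IS_OPAQUE guarantee: "even though this image has an
// alpha channel, every alpha value is 1.0 (255 in RGBA8)".
fn can_mark_opaque(rgba8_pixels: &[u8]) -> bool {
    // Alpha is the 4th byte of each RGBA8 pixel.
    rgba8_pixels.chunks_exact(4).all(|px| px[3] == 255)
}

fn main() {
    let opaque = [10u8, 20, 30, 255, 40, 50, 60, 255];
    let translucent = [10u8, 20, 30, 255, 40, 50, 60, 128];
    assert!(can_mark_opaque(&opaque));
    assert!(!can_mark_opaque(&translucent));
    println!("ok");
}
```

Setting the flag on data that fails this check (as in the cases above) steps outside the hint's contract, which is why the results are undefined once blurs are involved.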

Whereas, I think the way you're interpreting it is "please ignore the alpha that is present in this image", which is a bit different.

Does that sound right, or am I misinterpreting the details above?

@johnoneil
Author

I see. Thanks for your time on this, Glenn. Your answer quickly gets to the heart of what we want to do. So even though what we have currently is working (using IS_OPAQUE in a way it's not really intended to be used), we really shouldn't rely on that.
I have one additional follow-up question. Let's say we wanted to present images with arbitrary alpha information but wanted to ignore that information during render (in situations like the above, sometimes with blurs, etc.). Might we be able to achieve this via some kind of custom filter (or shader?), say a filter which takes the image information and renders its color information as-is, effectively mimicking an opaque image or blending being disabled?
Any thoughts you might have are appreciated. As you answered my question above, I'll probably close this in a day or two.

@gw3583
Contributor

gw3583 commented Mar 11, 2021

Yep, that sounds right.

There are a couple of approaches I can think of for adding support for ignoring alpha in an image during render:

(1) Modify the incoming image data when it's provided to WR either during or after the upload into the texture cache.
(2) Add support for ignoring alpha explicitly during rendering (e.g. via some kind of shader swizzle during texture sampling).

(1) is going to be much better for performance (it only runs during init of an image), and is much simpler to implement than (2), but it has the downside that it would only work with images provided to WR that get stored in the texture cache (e.g. it wouldn't work with dynamically generated filter outputs, or external / native texture handles).
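The CPU-side approach (1) amounts to a one-pass rewrite of the alpha channel before the pixels enter the texture cache. A minimal sketch, with a hypothetical helper name (this is not actual webrender code):

```rust
// Approach (1): stomp the alpha channel to fully opaque when the image
// data is handed to WR, so the cached pixels are genuinely opaque and
// IS_OPAQUE becomes a truthful hint.
fn force_opaque_in_place(rgba8_pixels: &mut [u8]) {
    for px in rgba8_pixels.chunks_exact_mut(4) {
        px[3] = 255;
    }
}

fn main() {
    let mut pixels = [10u8, 20, 30, 128, 40, 50, 60, 0];
    force_opaque_in_place(&mut pixels);
    assert_eq!(pixels, [10, 20, 30, 255, 40, 50, 60, 255]);
    println!("ok");
}
```

Note that if the input is already premultiplied, forcing alpha to 255 leaves the darkened color channels intact (the same effect visible in Case 2's right-hand image), so for straight-alpha semantics the rewrite would need to happen before premultiplication.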

Would those limitations be fine for your use case? If so, we could probably do a simple CPU-side impl that modifies the data in the texture cache, and follow up with a more advanced impl that uses a shader if we hit any performance issues. It's unlikely I'd have time to implement this in the near future, but I'd be happy to write up some details of how to achieve it if you'd be interested in writing such a patch?

@johnoneil
Author

Thanks Glenn. At least for our use case, I think option (2) is the only way we'd be able to go. The shader swizzle you mention seems like what we want.
I'd be happy to write this. Let me talk it over with my coworkers and see if we can come up with some specifics and whether they think this is worth investing in.

@johnoneil
Author

I spoke with my coworkers and spent some time on this over the past week. I worked up a changeset that deviates somewhat from the proposed solutions above, but it may be worthy of feedback.

I'm just going to present that in hopes it might spur some thought or suggest an alternate approach. The crux of the code I'll present echoes similar code elsewhere, so it's possible we're just asking for a missing feature. If I can get some feedback on the possibility of bringing in a feature similar to what I lay out (or an alternate approach), I'll just close this.

I now know enough to restate our current issue more simply: use of a backdrop-filter blur on a primitive forces the use of BlendMode::PremultipliedAlpha during composition with the elements below it.

The crux of what I'm doing to avoid that is here (ignoring my comment typos ;-) ). I see similar switches for some other primitives during batching, like here and here. Of course, the backdrop-filter blur batching seems far more complex (it makes my head spin, honestly), but in theory I think the change isn't 100% crazy.
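The shape of the switch described above can be sketched in a self-contained way. The `BlendMode` variant names mirror webrender's, but the standalone enum and the `blend_mode_for_backdrop` function here are hypothetical, not the actual batching code:

```rust
// During batching, a primitive known to be opaque could select
// BlendMode::None (write color directly, no blending with the
// destination) instead of the PremultipliedAlpha mode that
// backdrop-filter blurs currently force.
#[derive(Debug, PartialEq)]
enum BlendMode {
    None,
    PremultipliedAlpha,
}

fn blend_mode_for_backdrop(prim_is_opaque: bool) -> BlendMode {
    if prim_is_opaque {
        BlendMode::None
    } else {
        BlendMode::PremultipliedAlpha
    }
}

fn main() {
    assert_eq!(blend_mode_for_backdrop(true), BlendMode::None);
    assert_eq!(blend_mode_for_backdrop(false), BlendMode::PremultipliedAlpha);
    println!("ok");
}
```

The hard part, as noted below, is not the switch itself but plumbing the "is opaque" knowledge from the public API down to the point where the batch's blend mode is chosen.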

The rest of the changeset I have is more hacky. I honestly don't know how to cleanly propagate the required "is_opaque" state through the API and into batching, so I just have clients add a flag to the primitive.

Any thoughts on that are appreciated. I was surprised I didn't have to get down to the swizzle level suggested above, and that may yet be necessary. Reading through the code, I also think there may be ways to achieve this by better leveraging the compositor support in webrender, but I'm having difficulty even exercising that for simple cases. Is there anything you can suggest that would help us understand the compositor flags better?

Lastly, I'll ask if you have suggestions regarding debugging webrender batching and scene trees. I've been using wrench, which is good for mocking up test cases, but I really need something to visualize the process of batching and building the scene. That might be a tall order, but that backdrop-filter blur is fairly complex.

Thanks again.

@johnoneil johnoneil changed the title Preserving image opacity when applying a backdrop-filter blur Feature Request: Allow BlendMode::None when applying a backdrop-filter blur Apr 8, 2021
@johnoneil
Author

I updated the issue title to better reflect the issue's proper state. When starting out, I wasn't sure if this was a bug or not, but it seems to be a feature request: the ability (somehow) to allow backdrop-filter blurs to use BlendMode::None rather than strictly BlendMode::PremultipliedAlpha.
I doubt there are many people clamoring for this specific feature besides us, but I thought I'd ping this issue one more time. If it's going nowhere, could it be backlogged?
