Feature Request: Allow `BlendMode::None` when applying a backdrop-filter blur #4294
Comments
Hi John, thanks very much for the detailed report! I have only quickly scanned this so far, but I think there is possibly some confusion about the semantics of `ImageDescriptorFlags::IS_OPAQUE`. The intent of this flag is an optimization hint to WR that says "even though this image may have an alpha channel, I'm guaranteeing it's all 1.0, so you can treat it as opaque safely". For example, we could then draw this image in the opaque pass, taking advantage of z-rejection. Whereas, I think the way you're interpreting it is "please ignore the alpha that is present in this image", which is a bit different. Does that sound right, or am I misinterpreting the details above?
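To make the distinction concrete, here is an illustrative sketch (not actual WebRender code; the type and function names are hypothetical) of how an "opaque hint" is typically consumed by a renderer: the flag only changes which pass a primitive is assigned to, it never rewrites the sampled texel values.

```rust
// Illustrative only; names are hypothetical, not WebRender's actual ones.
#[derive(Clone, Copy)]
struct ImageDescriptor {
    is_opaque: bool, // stand-in for ImageDescriptorFlags::IS_OPAQUE
}

enum Pass {
    Opaque, // drawn front-to-back, hidden pixels rejected by the depth test
    Alpha,  // drawn back-to-front with blending
}

fn assign_pass(desc: ImageDescriptor) -> Pass {
    if desc.is_opaque {
        // The caller promised every texel already has alpha == 1.0,
        // so no blending is required and z-rejection can be used.
        Pass::Opaque
    } else {
        Pass::Alpha
    }
}

fn main() {
    // Setting the hint does not change the pixels; if the image really
    // contains alpha < 1.0, the promise is simply wrong.
    let desc = ImageDescriptor { is_opaque: true };
    assert!(matches!(assign_pass(desc), Pass::Opaque));
}
```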
I see. Thanks for your time on this, Glenn. Your answer quickly gets to the heart of what we want to do. So even though what we have currently is working (using `ImageDescriptorFlags::IS_OPAQUE`), it sounds like we're relying on the flag for something it isn't meant for, and what we really need is a way to have the alpha in the image ignored during rendering?
Yep, that sounds right. There are a couple of approaches to add support for ignoring alpha in an image during render that I can think of: (1) modify the incoming image data when it's provided to WR, either during or after the upload into the texture cache; (2) handle it at render time, e.g. by having the shaders swizzle the sampled texel so the alpha is forced to 1.0. (1) is going to be much better for performance (only during init of an image), and much simpler to implement than (2), but has the downside that it would only work with images provided to WR that get stored in the texture cache (e.g. it wouldn't work with dynamically generated filter outputs, or external / native texture handles). Would those limitations be fine for your use case? If so, we could probably do a simple CPU-side impl that modifies the data in the texture cache, and follow up with a more advanced impl that uses a shader if we hit any performance issues. It's unlikely I'd have time to implement this in the near future, but I'd be happy to write up some details of how to achieve it if you'd be interested in writing such a patch?
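As a rough illustration of option (1), here is a minimal sketch (mine, not from the thread) of forcing the alpha byte to 255 in BGRA8 pixel data before it is handed to the texture cache; the tightly packed BGRA8 layout (4 bytes per pixel) is an assumption.

```rust
/// Force every alpha byte in a tightly packed BGRA8 buffer to 0xFF.
/// Sketch of the "modify the image data on upload" approach; a real
/// implementation would live where WR copies data into the texture cache.
fn force_opaque_bgra8(pixels: &mut [u8]) {
    assert!(pixels.len() % 4 == 0, "expected 4 bytes per BGRA8 pixel");
    for px in pixels.chunks_exact_mut(4) {
        // Layout assumed to be [B, G, R, A]; only the alpha byte changes.
        px[3] = 0xFF;
    }
}

fn main() {
    // A 2x1 image: one half-transparent red pixel, one fully transparent pixel.
    let mut img = vec![0u8, 0, 255, 128, 0, 0, 0, 0];
    force_opaque_bgra8(&mut img);
    assert_eq!(img, [0u8, 0, 255, 255, 0, 0, 0, 255]);
}
```

Note that this only touches the alpha channel: if the color channels were already premultiplied, they stay darkened, which matches the "dark bottom" effect described in Case 2 of the original report below.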
Thanks Glenn. At least for our use case I think option #2 is the only way we'd be able to go. The shader swizzle you mention seems like what we want.
I spoke with my coworkers and spent some time on this during the week. I did work up a changeset that deviates somewhat from the proposed solutions above, but it may be worth some feedback. I'm just going to present it in hopes it might spur some thought or suggest an alternate approach. The crux of the code I'll present echoes similar code elsewhere, so it's possible we're just longing for a missing feature. If I can get some feedback on the possibility of bringing in a feature similar to what I lay out (or an alternate approach), I'll just close this.

I now know enough to restate our current issue more simply: using a backdrop blur filter on a primitive forces the use of `PremultipliedAlpha` during composition with the elements below. The crux of what I'm doing to avoid that is here (ignoring my comment typos ;-) ). I see similar switches for some other primitives during batching, like here and here. Of course the backdrop blur batching seems far more complex (it makes my head spin, honestly), but in theory I think the change isn't 100% crazy. The rest of the changeset I have is more hacky. I honestly don't know how to propagate that required "is_opaque" through the API and into the batching well, so I just have clients adding a flag to the primitive. Any thoughts on that are appreciated. I was surprised I didn't have to get into the swizzle level suggested above, and that may yet be necessary.

Reading through the code, I also think there may be ways to achieve this by better leveraging the compositor support in webrender, but I'm having difficulty even exercising that for simple cases. Is there anything you can suggest that would help us understand compositor flags better?

Lastly, I'll ask if you have suggestions regarding debugging webrender batching and scene trees. Thanks again.
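The linked diff isn't reproduced in the thread, but the general shape of the change being described is a blend-mode switch during batching. The following is a rough, hypothetical sketch of that idea; names like `prim_is_opaque` are mine, and this is not the actual WebRender batching code.

```rust
// Hypothetical sketch of picking a blend mode while batching a
// backdrop-filter result; not the real WebRender batching code.
#[derive(Clone, Copy, PartialEq)]
enum BlendMode {
    None,               // opaque pass, no blending with elements below
    PremultipliedAlpha, // alpha pass, blended with elements below
}

struct BackdropFilterPrim {
    // The flag a client would have to propagate through the API
    // (the "hacky" part mentioned above).
    prim_is_opaque: bool,
}

fn select_blend_mode(prim: &BackdropFilterPrim) -> BlendMode {
    if prim.prim_is_opaque {
        // Everything under the blur is known to be opaque, so the filter
        // output could be composited without blending.
        BlendMode::None
    } else {
        // Default behaviour today: the blurred backdrop is blended.
        BlendMode::PremultipliedAlpha
    }
}

fn main() {
    let prim = BackdropFilterPrim { prim_is_opaque: true };
    assert!(select_blend_mode(&prim) == BlendMode::None);
}
```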
I've updated the issue title to (I think) reflect its proper state. When starting out I wasn't sure if it was a bug or not, but it seems to be a feature request, that is, the ability (somehow) to allow backdrop blurs to use `BlendMode::None`.
I've been struggling with an issue related to rendering images with alpha channels as opaque while also applying backdrop-filter blurs. We have some images that have alpha channel information, but we'd like to render them 100% opaque, and with blurs on top. I can't quite get it to work.

I've done some work recreating the issue in `wrench` and can step through it in the debugger. I'm currently of the opinion that this functionality would have to be added, but I thought I would walk through our case and ask whether perhaps we're going about this the wrong way. Since `wrench` makes sharing test cases very easy, I've written several and they are linked below.

Also, forgive me for being a bit pedantic: I don't know webrender very well and am also using this as a learning exercise, so I'm going to walk through our use case pretty slowly. Any suggestions to achieve what we're looking for are appreciated.
Case 1 (no issue): Images with alpha and premultiplied alpha

I had to make some changes to `wrench` (which currently converts all input images to premultiplied alpha), but I just wanted to test displaying images with "normal" alpha vs premultiplied alpha. The image on the left uses standard alpha, the image on the right uses premultiplied alpha. Both are correct. Excellent. The `yaml` file I'm using for this is here.
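As background for the straight-vs-premultiplied distinction above, here is a minimal sketch (my own, not wrench's actual code) of the conversion wrench performs on input images: each color channel is scaled by the alpha value.

```rust
/// Convert one straight-alpha RGBA8 pixel to premultiplied alpha:
/// each color channel is scaled by alpha / 255 (with rounding).
fn premultiply_rgba8(px: [u8; 4]) -> [u8; 4] {
    let a = px[3] as u32;
    let mul = |c: u8| ((c as u32 * a + 127) / 255) as u8;
    [mul(px[0]), mul(px[1]), mul(px[2]), px[3]]
}

fn main() {
    // 50%-transparent pure red: the red channel is halved, alpha is kept.
    assert_eq!(premultiply_rgba8([255, 0, 0, 128]), [128, 0, 0, 128]);
}
```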
Case 2 (no issue): Specifying opacity

The next case takes the images from above and simply specifies that they should be opaque (using `ImageDescriptorFlags::IS_OPAQUE`). I added the ability to specify opacity to `wrench`, as it currently infers opacity from the image content. So, regardless of image content (presence of an alpha channel), I'd like to be able to specify that certain images are opaque. The public API makes this possible, and the results below are correct. The dark bottom of the image on the right is because the alpha was premultiplied into the color channels. The `yaml` file for this test is here.
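For readers less familiar with the public API being referred to, the flag is set on the image descriptor when the image is registered with WebRender. A rough sketch follows; the constructor and transaction signatures vary between webrender_api versions, so treat them as approximate rather than authoritative — the key point is only that `IS_OPAQUE` lives in `ImageDescriptorFlags`.

```rust
// Sketch of tagging an image as opaque through the public API.
// Signatures are approximate and version-dependent.
use webrender_api::{
    ImageData, ImageDescriptor, ImageDescriptorFlags, ImageFormat, ImageKey, Transaction,
};

fn add_opaque_image(
    txn: &mut Transaction,
    key: ImageKey,
    width: i32,
    height: i32,
    pixels: Vec<u8>,
) {
    // Declare the image opaque even though the data carries an alpha channel.
    let flags = ImageDescriptorFlags::IS_OPAQUE;
    let descriptor = ImageDescriptor::new(width, height, ImageFormat::BGRA8, flags);
    txn.add_image(key, descriptor, ImageData::new(pixels), None);
}
```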
Case 3 (no issue): Specifying a `backdrop-filter` blur over transparent images

The next case superimposes a blur (specifically a backdrop-filter blur) over Case 1 above (images are transparent). As you'd expect, the results are correct. The `yaml` file for this test is here.
Case 4 (issue): Specifying a `backdrop-filter` blur over opaque images

Now the problematic part. This case superimposes a `backdrop-filter` blur over images which have been tagged via `ImageDescriptorFlags::IS_OPAQUE`. As can be seen, the opacity is not preserved in this case. I'd expect the color bars beneath the blur to look more like Case 2 above rather than Case 1. The `yaml` script for this test is here.
Summary:

Do you think there is another way to achieve the desired effect? I've iterated quite a bit on using different blend modes and stacking contexts (`wrench` makes this very easy), but the tagged image opacity seems to be lost when batching the complex rendering passes required for the backdrop-filter blur.

I'd also be interested in any suggestions for ways to debug this (for example, visualizing the various passes done for backdrop-filter blurs) and perhaps getting more information on how the batches are formed.

Appreciate any help you can provide.