
_removeByMatchingTags out of memory with large number of matching entries #183

Open
michielroding opened this issue Sep 5, 2024 · 3 comments

@michielroding

Calling Cm_Cache_Backend_Redis::clean(Zend_Cache::CLEANING_MODE_MATCHING_TAG, ['bla']) runs out of memory when a large number of entries match the tag.

PHP Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 33554440 bytes) in /…/vendor/colinmollenhour/credis/Client.php on line 1353

v1.17.1

We could of course increase the memory limit, but it will fail again eventually. Would it be possible to do this in batches, or via a Lua script?

@colinmollenhour
Owner

MATCHING_TAG uses AND logic and MATCHING_ANY_TAG uses OR logic. I'm not sure what the use case for the former is; perhaps you actually want the latter? The latter already uses Lua, so it should work with very large data sets.
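
To illustrate the difference, a minimal sketch (the zc:ti: tag-set prefix is the backend's default, and the ids in the comments are hypothetical):

    // Suppose the tag sets contain these cache ids (hypothetical data;
    // zc:ti: is the backend's default tag-set key prefix):
    //   zc:ti:product123 = {id1, id2}
    //   zc:ti:reviews    = {id1, id3}

    // MATCHING_TAG (AND): entries carrying *all* given tags -> SINTER -> {id1}
    $cache->clean(Zend_Cache::CLEANING_MODE_MATCHING_TAG, ['product123', 'reviews']);

    // MATCHING_ANY_TAG (OR): entries carrying *any* given tag -> SUNION -> {id1, id2, id3}
    $cache->clean(Zend_Cache::CLEANING_MODE_MATCHING_ANY_TAG, ['product123', 'reviews']);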

@michielroding
Author

No, we want the former.

For example, imagine the following sets of tags:

['product123', 'reviews']
['product123', 'shippingcost']
['product456', 'shippingcost']

This way we can remove various combinations of cache entries, and sometimes we do need AND semantics.

We have some CLI scripts now that do something like redis-cli sinter product123 reviews | <transform to unlink key> | redis-cli, but that is a bit cumbersome.
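
Spelled out, that pipeline is roughly the following (a sketch assuming the backend's default zc:ti: tag-set and zc:k: data-key prefixes; ours may differ):

    redis-cli SINTER zc:ti:product123 zc:ti:reviews \
      | sed 's/^/UNLINK zc:k:/' \
      | redis-cli

And this only unlinks the data keys; the ids are left behind in the tag sets and the ids bookkeeping set, so it is not a complete cleanup.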

Would it be possible to handle MATCHING_TAG the same way as MATCHING_ANY_TAG? A partial solution: when only one tag is specified, Cm_Cache_Backend_Redis::clean could call _removeByMatchingAnyTags even for Zend_Cache::CLEANING_MODE_MATCHING_TAG, since AND and OR are equivalent for a single tag.

@colinmollenhour
Owner

Yes, it would be possible to write a Lua function that clears keys using SINTER so it can be done server-side. I'm not in a place to do that but would be happy to accept a PR.
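
A rough sketch of what such a script could look like (untested; the zc:ti:/zc:k: prefixes are the backend's defaults, the phpredis-style eval() call is an assumption and may differ under Credis, and a real implementation would also have to prune the tag/id bookkeeping sets the way _removeByMatchingAnyTags does):

    // PHP 7.3+ heredoc; the script runs entirely server-side.
    $script = <<<'LUA'
    -- KEYS: tag-set keys to intersect, e.g. zc:ti:product123, zc:ti:reviews
    -- ARGV[1]: data-key prefix, e.g. 'zc:k:'
    local ids = redis.call('SINTER', unpack(KEYS))
    local removed = 0
    -- Chunked UNLINK: avoids Lua's unpack() stack limit on huge results.
    for i = 1, #ids, 500 do
        local chunk = {}
        for j = i, math.min(i + 499, #ids) do
            chunk[#chunk + 1] = ARGV[1] .. ids[j]
        end
        removed = removed + redis.call('UNLINK', unpack(chunk))
    end
    return removed
    LUA;

    // Invocation sketch, phpredis-style: keys first, then ARGV.
    $tagKeys = ['zc:ti:product123', 'zc:ti:reviews'];
    $removed = $redis->eval($script, array_merge($tagKeys, ['zc:k:']), count($tagKeys));

SINTER still materializes the full intersection inside Redis, but the chunked UNLINK keeps everything server-side and off the PHP heap.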
The single-element optimization seems quite simple, but you could also do that optimization in the app.
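
In the app that could be as small as this (a sketch; the mode is swapped before the backend is called):

    // With a single tag, AND and OR semantics coincide, so the
    // Lua-backed any-tag path can be used safely.
    $mode = (count($tags) === 1)
        ? Zend_Cache::CLEANING_MODE_MATCHING_ANY_TAG
        : Zend_Cache::CLEANING_MODE_MATCHING_TAG;
    $cache->clean($mode, $tags);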
Bumping your memory limit actually seems pretty sane if you need an urgent fix; if your data set grows without bound, you will probably run into other issues eventually.
