When calling Cm_Cache_Backend_Redis::clean(Zend_Cache::CLEANING_MODE_MATCHING_TAG, ['bla']), the process runs out of memory when there are a large number of matching entries:
PHP Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 33554440 bytes) in /…/vendor/colinmollenhour/credis/Client.php on line 1353
v1.17.1
We could of course increase the memory limit but it'll fail eventually; would it be possible to do this in batches, or via a Lua script?
MATCHING_TAG uses AND logic and MATCHING_ANY_TAG uses OR logic. I'm not sure what a use-case is for the former, perhaps you actually want the latter? The latter uses Lua already so should work with very large limits.
This way we can remove various combinations of caches - and sometimes we do need AND.
We have some CLI scripts now that do something like redis-cli sinter product123 reviews | <transform to unlink key> | redis-cli, but that is a bit cumbersome.
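The transform-and-batch step of that pipeline could be sketched as follows. This is a dry run only: the printf stands in for the output of redis-cli sinter on the tag sets, the zc:ti:/zc:k: prefixes are assumed backend defaults, and in real use you would replace echo UNLINK with redis-cli UNLINK.

```shell
# Dry-run sketch of the CLI approach, batching deletes to bound memory.
# Stand-in for: redis-cli --raw sinter zc:ti:product123 zc:ti:reviews
printf 'key1\nkey2\nkey3\n' |
  sed 's/^/zc:k:/' |           # prepend the cache-key prefix to each id
  xargs -n 2 echo UNLINK       # group ids into batched UNLINK commands
```

Batching via xargs -n keeps each UNLINK call bounded, so neither the client nor a single command has to hold the full key list at once.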
Would it be possible to handle MATCHING_TAG the same way as MATCHING_ANY_TAG? One possible solution: when only one tag is specified, Cm_Cache_Backend_Redis::clean could use _removeByMatchingAnyTags for Zend_Cache::CLEANING_MODE_MATCHING_TAG, since AND and OR are equivalent for a single tag.
Yes, it would be possible to write a Lua function that clears keys using SINTER so it can be done server-side. I'm not in a place to do that but would be happy to accept a PR.
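Such a script might look roughly like this — a sketch only, assuming the backend's default zc:ti: tag-set and zc:k: key prefixes, and ignoring the tag-set bookkeeping a real implementation would also need:

```lua
-- Sketch: intersect the tag sets (AND semantics), then delete the
-- matching cache keys in batches so no single UNLINK call is huge.
-- KEYS:    the tag set keys (e.g. zc:ti:product123, zc:ti:reviews)
-- ARGV[1]: the cache key prefix (e.g. zc:k:)
local ids = redis.call('SINTER', unpack(KEYS))
local batch = {}
for _, id in ipairs(ids) do
    batch[#batch + 1] = ARGV[1] .. id
    if #batch >= 100 then
        redis.call('UNLINK', unpack(batch))
        batch = {}
    end
end
if #batch > 0 then
    redis.call('UNLINK', unpack(batch))
end
return #ids
```

Note the caveat: SINTER still materializes the full id list in the script's memory on the server, but at least nothing is shipped to the PHP client.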
The single-element optimization seems quite simple, but you could also do that optimization in the app.
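The app-side version of that optimization could be as small as this (hypothetical wrapper code, not part of the backend):

```php
<?php
// With a single tag, AND and OR semantics coincide, so the OR path
// (which already runs server-side via Lua) can be used safely.
$mode = (count($tags) === 1)
    ? Zend_Cache::CLEANING_MODE_MATCHING_ANY_TAG
    : Zend_Cache::CLEANING_MODE_MATCHING_TAG;
$cache->clean($mode, $tags);
```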
Bumping your memory limit is actually a pretty sane thing to do if you need an urgent fix; if the size of your data set grows endlessly, you are probably going to run into other issues eventually anyway.