
use redis as the event storage backend #120

Open
xgkk opened this issue May 25, 2023 · 4 comments
xgkk commented May 25, 2023

I want to use redis as the event storage backend. What should I do? Because send_event is called in a multi-process scenario, the client does not receive messages consistently.


jkarneges commented May 25, 2023

Hi there. If you want to store events in redis instead of a SQL database, you could write your own StorageBase implementation and then set settings.EVENTSTREAM_STORAGE_CLASS. However, the problem with multiple processes isn't storage, it's distribution. The approach used by django-eventstream when there are multiple processes is to rely on Pushpin for distribution. See https://github.com/fanout/django-eventstream#multiple-instances-and-scaling
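To make the pluggable-storage suggestion concrete, here is a minimal sketch of a Redis-backed store. In a real project the class would subclass django_eventstream.storage.StorageBase and be registered via settings.EVENTSTREAM_STORAGE_CLASS; the method names below (append_event, get_current_id, get_events) are assumptions modeled on the SQL-backed storage and should be checked against django_eventstream/storage.py. The FakeRedis stand-in exists only so the sketch runs without a live server; in production you would pass a real redis.Redis client.

```python
import json


class RedisStorage:
    """Hypothetical Redis-backed event store (method names assumed,
    verify against django_eventstream.storage.StorageBase)."""

    def __init__(self, client):
        # `client` is any object with Redis-like incr/get/hset/hget
        # commands, e.g. redis.Redis(host="localhost", port=6379).
        self.client = client

    def _id_key(self, channel):
        return "es:%s:id" % channel

    def _events_key(self, channel):
        return "es:%s:events" % channel

    def append_event(self, channel, event_type, data):
        # INCR yields a monotonically increasing per-channel event id,
        # which is what Last-Event-ID resumption relies on.
        eid = self.client.incr(self._id_key(channel))
        payload = json.dumps({"type": event_type, "data": data})
        self.client.hset(self._events_key(channel), eid, payload)
        return eid

    def get_current_id(self, channel):
        val = self.client.get(self._id_key(channel))
        return int(val) if val is not None else 0

    def get_events(self, channel, last_id, limit=100):
        # Return events after last_id so a reconnecting client can catch up.
        out = []
        for eid in range(last_id + 1, self.get_current_id(channel) + 1):
            raw = self.client.hget(self._events_key(channel), eid)
            if raw is not None:
                out.append((eid, json.loads(raw)))
            if len(out) >= limit:
                break
        return out


class FakeRedis:
    """Minimal in-memory stand-in for redis.Redis, only so this
    sketch is runnable without a Redis server."""

    def __init__(self):
        self.kv = {}
        self.hashes = {}

    def incr(self, key):
        self.kv[key] = self.kv.get(key, 0) + 1
        return self.kv[key]

    def get(self, key):
        return self.kv.get(key)

    def hset(self, name, key, value):
        self.hashes.setdefault(name, {})[key] = value

    def hget(self, name, key):
        return self.hashes.get(name, {}).get(key)
```

Note that this only addresses persistence (catch-up after reconnect); as said above, live distribution still goes through Pushpin.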


xgkk commented Jun 6, 2023

fine! thanks!


acuD1 commented Aug 1, 2023

Hi! I'm bumping this. So in a multi-process situation, if I want to use Redis as the storage backend, I would let Pushpin rely on Redis instead of Django, right?

Like the Pushpin documentation shows with a Kafka example here: https://github.com/fanout/kafka-sse-example


jkarneges commented Aug 15, 2023

In django-eventstream, storage is separate from distribution, and only storage is pluggable. You can subclass StorageBase to store messages in Redis, but distribution will continue to go directly from Django to Pushpin. This works with multiple processes.

The Pushpin Kafka example is built differently, using a background process that reads from Kafka. It may be possible to hack django-eventstream to work similarly: change send_event to not send to Pushpin, then run a background process that listens to Redis for changes and reuses django-eventstream's utility functions to publish to Pushpin. But this would be tricky, and I'm not sure why you'd want to do it.
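For illustration only, the "background relay" hack described above might look like the sketch below. The publish callback is a hypothetical stand-in for django-eventstream's real publishing helpers (their actual names would need to be looked up in the source), and the relay loop takes any iterable of payloads so it can run without a server; in production, messages would come from redis.Redis().pubsub().listen(), filtered to items of type "message".

```python
import json


def relay(messages, publish):
    """Consume raw pub/sub payloads and republish each one toward
    Pushpin via the supplied `publish(channel, event_type, data)`
    callback. Returns the number of events relayed.

    `messages` is any iterable of JSON strings; with a real Redis
    client it would be the pubsub().listen() stream (an assumption
    for this sketch, not django-eventstream's actual design)."""
    count = 0
    for raw in messages:
        evt = json.loads(raw)
        publish(evt["channel"], evt["type"], evt["data"])
        count += 1
    return count
```

This is exactly the extra stateful moving part the next paragraph argues against: the relay process must stay running and keep its subscription alive, whereas stock django-eventstream has no such daemon.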

One nice thing about django-eventstream compared to the Kafka example is that it doesn't have any background processes, so it can be run statelessly.
