Rewrite InMemoryBroker over aiochannel #92
base: develop
Conversation
Codecov Report
```
@@            Coverage Diff             @@
##           master      #92       +/-   ##
==========================================
+ Coverage   61.94%   66.55%    +4.61%
==========================================
  Files          37       37
  Lines        938      930        -8
==========================================
+ Hits         581      619       +38
+ Misses       357      311       -46
```

... and 5 files with indirect coverage changes
I don't think that this PR is going to be merged in the way it exists now, for two reasons.

First of all, where does all the processing happen? I guess now you need to start the listen task yourself. My proposal would be to create a separate broker that allows you to run code using memory channels.

Also, you need to update the docs, because currently there's not a single word in them about having to start async listening somewhere in your codebase explicitly.
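To make the objection concrete, here is a minimal sketch of a broker where messages are only processed once a listen task is started explicitly. It uses a plain `asyncio.Queue` in place of an aiochannel channel, and the class and method names (`MemoryBroker`, `kick`, `listen`) are illustrative only, not taskiq's actual API:

```python
import asyncio


class MemoryBroker:
    """Illustrative in-memory broker; NOT the real taskiq API."""

    def __init__(self):
        self._queue = asyncio.Queue()
        self._handlers = {}

    def task(self, name):
        """Register a handler under a name (decorator)."""
        def wrapper(func):
            self._handlers[name] = func
            return func
        return wrapper

    async def kick(self, name, *args):
        """Enqueue a message; nothing runs until listen() consumes it."""
        await self._queue.put((name, args))

    async def listen(self):
        """Drain the queue and run the registered handlers."""
        while not self._queue.empty():
            name, args = await self._queue.get()
            await self._handlers[name](*args)


async def main():
    broker = MemoryBroker()
    results = []

    @broker.task("add")
    async def add(a, b):
        results.append(a + b)

    await broker.kick("add", 1, 2)
    # Without this explicit call, the kicked message is never processed —
    # which is exactly the usability concern raised above.
    await broker.listen()
    return results


print(asyncio.run(main()))  # prints [3]
```

The point of the sketch is the last two calls: the user has to know to run `listen()` somewhere, which is the behavior change the docs would need to cover.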
Before this PR you could write something like this:

```python
broker = AioPikaBroker("...")
if os.environ.get("APP_ENV", "dev") == "dev":
    broker = InMemoryBroker()
```

With that, you don't need to modify your code in any other place, and you don't have to explicitly start a listening task somewhere. It's just a convenient interface to use if you don't want to run a distributed queue like RabbitMQ, Redis, or anything else. It's also suitable for testing functions that use tasks.