The Payload Spooler
Send your payload now, treat it later.
Payler is an asyncio-based Python application that delays message execution: send your payload now, and have it re-delivered later. The goal is to reduce the workload on your existing message broker (only RabbitMQ is currently supported, but other message brokers can easily be implemented) by storing payloads in a storage backend, which is then polled to re-inject payloads into their corresponding destination.
Through PyPI:

```shell
$ pip install payler
```

Through poetry:

```shell
$ git clone https://github.com/tbobm/payler
$ cd payler
$ poetry install
```
Using the command line:

- Specify the input and output URLs for your drivers (see configuration)
- (optional) Customize the configuration to suit your needs (currently the example configuration is the only valid one)
- Run payler

```shell
$ payler --config-file configuration.yaml
```
Using the docker image:

- Pull the docker image

```shell
$ docker pull ghcr.io/tbobm/payler:latest
```

- (optional) Customize the configuration to suit your needs (currently the example configuration is the only valid one) and mount the configuration file into the container at `/configuration.yaml`
- Run the docker image and provide environment variables

```shell
$ docker run -d --name payler \
    -e BROKER_URL="amqp://payler:secret@my-broker/" \
    -e MONGODB_URL="mongodb://payler:secret@my-mongo/payler" \
    ghcr.io/tbobm/payler
```
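For local experimentation, a docker-compose sketch along these lines may help. The service names, credentials, and image tags below are assumptions for illustration, not an official compose file:

```yaml
version: "3.8"
services:
  rabbitmq:
    image: rabbitmq:3-management
  mongodb:
    image: mongo:5
  payler:
    image: ghcr.io/tbobm/payler:latest
    environment:
      BROKER_URL: "amqp://guest:guest@rabbitmq/"
      MONGODB_URL: "mongodb://mongodb/payler"
    depends_on:
      - rabbitmq
      - mongodb
```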
In order to configure the different workflows, payler uses a configuration file (see configuration.yml).
Example config file:

```yaml
---
workflows:
  - name: "Fetch payloads from RabbitMQ and store them in MongoDB"
    location: "payler"
    callable: "client.process_queue"
  - name: "Re-injects payloads to RabbitMQ"
    callable: "client.watch_storage"
```
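The `location` and `callable` values are dotted paths. A loader roughly like the following could combine them; this is an illustrative sketch, not Payler's actual implementation, and `resolve_workflow` is a made-up name:

```python
import importlib


def resolve_workflow(location: str, dotted_callable: str):
    """Resolve a workflow function from a config entry (illustrative only).

    ``location`` names the package, ``dotted_callable`` the module path and
    function name within it, e.g. ("payler", "client.process_queue").
    """
    module_path, func_name = dotted_callable.rsplit(".", 1)
    module = importlib.import_module(f"{location}.{module_path}")
    return getattr(module, func_name)


# Demonstrated with a stdlib package instead of payler:
join = resolve_workflow("os", "path.join")
```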
The `workflows[].name` attribute is currently unused, but will offer a more human-friendly way of getting informed about a workflow's state.

The `workflows[].location` attribute corresponds to the package where the `workflows[].callable` can be found. It defaults to `payler`, but this is also a dumb and simple plugin mechanism: point it at your own package and create a function matching the following signature:
```python
async def my_workflow(loop: asyncio.AbstractEventLoop) -> None:
    """My user-defined workflow."""
    # configure your driver(s)
    await input_driver.serve()
```
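A complete, if trivial, workflow matching that signature might look like the sketch below. The body is made up for illustration (a real workflow would configure and serve a driver); the point is the signature and how a runner could schedule several workflows on one event loop:

```python
import asyncio

RESULTS = []


async def my_workflow(loop: asyncio.AbstractEventLoop) -> None:
    """Hypothetical workflow: wait briefly, then record that it ran."""
    await asyncio.sleep(0.01)
    RESULTS.append("ran")


async def main() -> None:
    # A runner would schedule every configured workflow on the same loop.
    loop = asyncio.get_running_loop()
    await asyncio.gather(my_workflow(loop), my_workflow(loop))


asyncio.run(main())
```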
- Forward messages between multiple datasources
- Based on asyncio (benchmarks are on the roadmap)
- Extend using your own implementation of the `BaseDriver` class
| driver | `process` | `serve` |
|---|---|---|
| `BrokerManager` | Send a Payload to a Queue | Consume a queue's messages |
| `SpoolerManager` | Store a Payload in a Collection | Fetch documents with a specific reference date |
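Extending Payler with a custom driver could look roughly like the sketch below. The `BaseDriver` shown here is an illustrative stand-in mirroring the `process`/`serve` split from the table above, not payler's actual class, and `InMemoryDriver` is a toy:

```python
import asyncio
from abc import ABC, abstractmethod


class BaseDriver(ABC):
    """Illustrative stand-in for payler's BaseDriver (real interface may differ)."""

    @abstractmethod
    async def process(self, payload: bytes) -> None:
        """Send a payload to this driver's destination."""

    @abstractmethod
    async def serve(self) -> None:
        """Consume payloads from this driver's source."""


class InMemoryDriver(BaseDriver):
    """Toy driver that spools payloads through an asyncio.Queue."""

    def __init__(self) -> None:
        self.queue: asyncio.Queue = asyncio.Queue()
        self.delivered: list = []

    async def process(self, payload: bytes) -> None:
        await self.queue.put(payload)

    async def serve(self) -> None:
        # A real driver would loop forever; one payload is enough for a demo.
        self.delivered.append(await self.queue.get())


async def demo() -> list:
    driver = InMemoryDriver()
    await driver.process(b"hello")
    await driver.serve()
    return driver.delivered


result = asyncio.run(demo())
```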
This project has unit tests written with pytest.
You can run them using:

```shell
$ poetry run pytest
```
Feel free to open new issues for feature requests and bug reports on the issue page, and even create PRs if you feel like it.
This project is linted with `pylint`, with some minor adjustments (see the pyproject.toml).
This side-project was born from the following:

- I wanted to experiment with Python's `asyncio`
- A friend of mine had issues delaying lots of messages using RabbitMQ's delayed exchange plugin
- I was looking for a concrete use case to work with GitHub Actions