Deduplication of media files #46
Regarding the migrations: If we implement this, we should make it at least a little future-proof. The simplest solution would be to just append statements to the … These are my thoughts so far. Maybe someone else has more experience regarding SQL/Postgres/Python migrations (@peb-adr @gsiv @r-peschke)?
I agree that we shouldn't need a third-party framework for this but I also don't think writing a simple mechanism for ourselves would be overkill, even if we aim to enable (data) migrations from any given previous version, as we should. I expect that we'll eventually need this for the other services as well. To achieve this, wouldn't it suffice to attach a version to the schema and apply the migrations, i.e., numbered SQL scripts, as necessary during the container's start-up routine? The only snag is going to be, as always, the coordination between scaled services. Unlike with the backend service, we should plan for a locking mechanism from the start this time but that, too, should be attainable. Remember that it needs to work with pgbouncer's transaction pooling mode though.
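For illustration only, here is a minimal sketch of what such a start-up routine could look like. It assumes psycopg2, a single-row `schema_version` table, zero-padded migration file names, and an arbitrary lock key — all of these are assumptions for the sketch, not the actual service layout:

```python
# Sketch of a start-up migration runner: read the stored schema version,
# apply any newer numbered SQL scripts, and serialise scaled instances
# with a transaction-level advisory lock.
import os
import re
import psycopg2

MIGRATIONS_DIR = "migrations"       # hypothetical: 0001_initial.sql, 0002_dedup.sql, ...
ADVISORY_LOCK_KEY = 46              # hypothetical, arbitrary constant

def apply_migrations(dsn: str) -> None:
    with psycopg2.connect(dsn) as conn, conn.cursor() as cur:
        # Transaction-level lock: held until commit, so only one instance
        # runs the migrations at a time.
        cur.execute("SELECT pg_advisory_xact_lock(%s)", (ADVISORY_LOCK_KEY,))

        cur.execute(
            "CREATE TABLE IF NOT EXISTS schema_version (version integer NOT NULL)"
        )
        cur.execute("SELECT version FROM schema_version")
        row = cur.fetchone()
        current = row[0] if row else 0

        # Apply every script whose number is greater than the stored version.
        for name in sorted(os.listdir(MIGRATIONS_DIR)):
            match = re.match(r"(\d+)_.*\.sql$", name)
            if not match or int(match.group(1)) <= current:
                continue
            with open(os.path.join(MIGRATIONS_DIR, name)) as f:
                cur.execute(f.read())
            current = int(match.group(1))

        cur.execute("DELETE FROM schema_version")
        cur.execute("INSERT INTO schema_version (version) VALUES (%s)", (current,))
        # Leaving the block commits the transaction and releases the lock.
```

Using `pg_advisory_xact_lock` rather than a session-level lock is what makes this compatible with pgbouncer's transaction pooling: the lock lives and dies inside a single transaction, so it doesn't matter which server connection the pooler assigns.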
To reduce the database size, we could store the hash of each file alongside its content. When a new file is uploaded, its hash is calculated first and checked against the database. If the hash is already present, the newly uploaded id is linked to the existing content instead of storing it again. This requires a new 1:n table which maps ids to their content, and therefore a migration, which I'm not sure is possible with the current setup...
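As a rough sketch of what that could look like (the table and column names `mediafile`, `mediafile_content` and `content_id` are made up here, and SHA-256 is just one possible hash):

```python
# Sketch: content rows are keyed by their hash, mediafile rows reference
# a content row, so identical uploads are stored only once.
import hashlib
import psycopg2

SCHEMA = """
CREATE TABLE IF NOT EXISTS mediafile_content (
    id   serial PRIMARY KEY,
    hash text   NOT NULL UNIQUE,   -- hex-encoded SHA-256 of data
    data bytea  NOT NULL
);
-- 1:n - many mediafile rows may point to the same content row
ALTER TABLE mediafile
    ADD COLUMN IF NOT EXISTS content_id integer REFERENCES mediafile_content (id);
"""

def save_upload(conn, mediafile_id: int, data: bytes) -> None:
    digest = hashlib.sha256(data).hexdigest()
    with conn.cursor() as cur:
        # Store the content only if this hash is not present yet ...
        cur.execute(
            """INSERT INTO mediafile_content (hash, data)
               VALUES (%s, %s)
               ON CONFLICT (hash) DO NOTHING""",
            (digest, data),
        )
        # ... then link the uploaded id to the (existing or new) content row.
        cur.execute("SELECT id FROM mediafile_content WHERE hash = %s", (digest,))
        content_id = cur.fetchone()[0]
        cur.execute(
            "UPDATE mediafile SET content_id = %s WHERE id = %s",
            (content_id, mediafile_id),
        )
    conn.commit()
```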