Series cleanup #3326
Conversation
Actually, I will also need to update any Feeds that reference one of the deleted series.
If you want to force the (libraryId, name) pair to be unique, it would be good to add a unique constraint to the series table.
I haven't looked into this yet, but at a glance I think we should call this from the
Oh, can we add the unique constraint after a column is created? I'm still not very familiar with table creation/modification. Yeah, if we want to move this to another file for the database creation, we can close this PR. I spent some time last night trying to get the original query to sort by how many feeds reference the series ID, then by feed creation time, then by series creation time, to ensure that if a feed exists for one of the series we prioritize that one, but I couldn't figure out the syntax.
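For illustration, the prioritization described above could be expressed as a plain JavaScript comparator instead of SQL. This is only a sketch: the row shape and the field names `feedCount`, `oldestFeedCreatedAt`, and `createdAt` are assumptions, not the actual schema.

```javascript
// Hypothetical sketch: given duplicate series rows annotated with feed info,
// prefer the one the most feeds reference, then the earliest feed creation
// time, then the earliest series creation time.
function pickCanonicalSeries(duplicates) {
  return [...duplicates].sort((a, b) => {
    // Most referenced by feeds wins first
    if (a.feedCount !== b.feedCount) return b.feedCount - a.feedCount
    // Then the series whose earliest feed is oldest
    if (a.oldestFeedCreatedAt !== b.oldestFeedCreatedAt) return a.oldestFeedCreatedAt - b.oldestFeedCreatedAt
    // Finally the oldest series itself
    return a.createdAt - b.createdAt
  })[0]
}

module.exports = pickCanonicalSeries
```

The same three-level ordering is what the SQL `ORDER BY` would need to express, with the feed count coming from a join against the feeds table.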
Unfortunately, when I set up Sequelize I didn't take the time to add the migrations table to handle DB updates. Ideally, when doing a table update you also supply a way to revert the changes when downgrading versions. Currently all the table updates are one-way, and they were all done around v2.3.0. I'd like to set up the migrations table before doing any more one-way table updates like that.
Yeah, that should be possible (at least in theory). You just add the following to your series model init:

```js
static init(sequelize) {
  super.init(
    {
      // ...
    },
    {
      sequelize,
      modelName: 'series',
      indexes: [
        // ...
        // New multi-column index
        {
          unique: true, // Add the unique constraint
          fields: ['libraryId', 'name']
        }
      ]
    }
  )
}
```

But as @advplyr says, it should be done as part of a two-way migration setup.
@advplyr, @nichwall, would you like me to take a stab at creating a migration setup? I doubt we have many users still running v2.3.0 at this point, so it would probably be safe to use the current DB schema as the baseline. Please let me know what you think.
Yeah, that would be great. I looked into it briefly and set up a test, so I'm somewhat familiar with it. I agree that we can start with the current DB schema.
Yeah, that sounds like a good plan. Do these migrations work for arbitrary steps? Like, if I said "Make a new table that contains all of the tags", could we create the new table and populate it from the JSON in the books table, and then when reverting delete that table and add the data back into the JSON? Or is it just adding/removing columns and tables that we can do?
You can write whatever you want in the migration, and it's fully up to you to decide and implement the migration's up and down semantics.
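As a concrete illustration of those up/down semantics, a migration adding the unique (libraryId, name) index might look roughly like this. This is only a sketch: it assumes an umzug-style runner that passes a Sequelize `queryInterface` as `context`, and the file layout, index name, and parameter shape are illustrative, not the actual ABS migration API.

```javascript
// Hypothetical migration sketch: up() adds the unique (libraryId, name) index
// on the series table, and down() reverts it. The { context: queryInterface }
// argument shape assumes an umzug-style runner.
const INDEX_NAME = 'series_library_id_name_unique'

async function up({ context: queryInterface }) {
  await queryInterface.addIndex('series', ['libraryId', 'name'], {
    unique: true,
    name: INDEX_NAME
  })
}

async function down({ context: queryInterface }) {
  await queryInterface.removeIndex('series', INDEX_NAME)
}

module.exports = { up, down }
```

The key property is that `down()` exactly undoes `up()`, so a failed upgrade (or a version downgrade) can restore the previous schema.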
@nichwall, @advplyr, can you please take a look at and review a small design doc I've written for this? This is a sensitive feature, so I want to make sure I'm doing it correctly. I think the main question I'd like your input on is what to do on migration failures, but please let me know if you have any other comments or questions. I also have the code mostly ready; I'll send a PR once you've reviewed and approved the design.
Now that the migrations are implemented in #3378, do you want me to update this PR to instead use the migration for making the column
From my perspective, please go ahead and update. It would be good to have someone other than myself try to write and test a migration script based on the instructions. I'll be happy to review the code.
Sounds good. I'll try to get to that soon, but it may be a few days.
- Additional validation on API endpoints
- Removed success toast when reordering libraries
Removes a lot of unused (in ABS) functionality, refactors to an ES6-style class, and re-enables this custom implementation with a check period and TTL of 1 day and a maximum of 1000 entries. The class now only implements the methods required by the express-session docs and removes the optional methods, except touch(), which allows the TTL of an entry to be refreshed without affecting its LRU recency. There is no longer a way to stop the prune timer, but I don't believe that function was ever being called anyway. The session store's lifetime is the same as the application's, and since the timer is unref()'d it should not cause any shutdown issues.
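A minimal sketch of the store behavior described above: TTL, a max-entry cap with LRU eviction, a touch() that refreshes TTL without changing LRU recency, and an unref()'d prune timer. This is not the actual ABS implementation; real express-session stores use callback-based methods, which are omitted here for brevity.

```javascript
// Hypothetical sketch of an LRU session store with TTL and periodic pruning.
// A Map's insertion order doubles as LRU order: re-inserting refreshes recency.
class LruSessionStore {
  constructor({ ttl = 86400000, maxEntries = 1000, checkPeriod = 86400000 } = {}) {
    this.ttl = ttl
    this.maxEntries = maxEntries
    this.sessions = new Map()
    // Prune expired entries periodically; unref() so the timer never blocks shutdown
    this.pruneTimer = setInterval(() => this.prune(), checkPeriod)
    if (this.pruneTimer.unref) this.pruneTimer.unref()
  }
  set(sid, session) {
    this.sessions.delete(sid) // re-insert to refresh LRU recency
    this.sessions.set(sid, { session, expires: Date.now() + this.ttl })
    if (this.sessions.size > this.maxEntries) {
      // First key in iteration order is the least recently used entry
      this.sessions.delete(this.sessions.keys().next().value)
    }
  }
  get(sid) {
    const entry = this.sessions.get(sid)
    if (!entry) return null
    if (entry.expires <= Date.now()) {
      this.sessions.delete(sid)
      return null
    }
    // Reads refresh LRU recency
    this.sessions.delete(sid)
    this.sessions.set(sid, entry)
    return entry.session
  }
  touch(sid) {
    // Refresh TTL without changing LRU position
    const entry = this.sessions.get(sid)
    if (entry) entry.expires = Date.now() + this.ttl
  }
  destroy(sid) {
    this.sessions.delete(sid)
  }
  prune() {
    const now = Date.now()
    for (const [sid, entry] of this.sessions) {
      if (entry.expires <= now) this.sessions.delete(sid)
    }
  }
}

module.exports = LruSessionStore
```

Because the timer is unref()'d, the store never needs an explicit stop method: the process can exit while the interval is still scheduled, which matches the "lifetime is the same as the application's" design above.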
Currently translated at 100.0% (872 of 872 strings) Translation: Audiobookshelf/Abs Web Client Translate-URL: https://hosted.weblate.org/projects/audiobookshelf/abs-web-client/es/
Currently translated at 99.8% (871 of 872 strings) Translation: Audiobookshelf/Abs Web Client Translate-URL: https://hosted.weblate.org/projects/audiobookshelf/abs-web-client/fr/
Currently translated at 99.8% (871 of 872 strings) Translation: Audiobookshelf/Abs Web Client Translate-URL: https://hosted.weblate.org/projects/audiobookshelf/abs-web-client/de/
Currently translated at 100.0% (974 of 974 strings) Translation: Audiobookshelf/Abs Web Client Translate-URL: https://hosted.weblate.org/projects/audiobookshelf/abs-web-client/sl/
Currently translated at 99.7% (972 of 974 strings) Translation: Audiobookshelf/Abs Web Client Translate-URL: https://hosted.weblate.org/projects/audiobookshelf/abs-web-client/es/
Currently translated at 100.0% (974 of 974 strings) Translation: Audiobookshelf/Abs Web Client Translate-URL: https://hosted.weblate.org/projects/audiobookshelf/abs-web-client/zh_Hans/
Currently translated at 100.0% (974 of 974 strings) Translation: Audiobookshelf/Abs Web Client Translate-URL: https://hosted.weblate.org/projects/audiobookshelf/abs-web-client/bn/
Currently translated at 100.0% (974 of 974 strings) Translation: Audiobookshelf/Abs Web Client Translate-URL: https://hosted.weblate.org/projects/audiobookshelf/abs-web-client/de/
```js
beforeRouteLeave(to, from, next) {
  if (this.hasChanges) {
    next(false)
    window.location = to.path
```

Check failure (Code scanning / CodeQL): Client-side cross-site scripting, High severity, caused by a user-provided value.
```js
if (library.update({ displayOrder: orderdata[i].newOrder })) {
  hasUpdates = true
  await Database.updateLibrary(library)
  return res.status(400).send(`Library not found with id ${orderdata[i].id}`)
```

Check failure (Code scanning / CodeQL): Reflected cross-site scripting, High severity, a cross-site scripting vulnerability due to a user-provided value.
Well, it looks like I merged the branches incorrectly. I will close this PR and open a new one in a few hours, because I'll be making a lot of changes to how this is done anyway.
This PR fixes #3207 by combining series that have identical names and the same `libraryId`. I originally only used the series name, but realized that the same series may exist in two different libraries. This PR does not fix the "duplicate entries" causing problems for the InputDropdown component; instead, it fixes the data in the database itself.

This is done by adding a cleanup function to server startup which updates all `bookSeries` entries that have duplicate series names and `libraryId`s to point to the newest series, and then deletes the now-unused duplicates.

This will help handle issues with series from previous server versions, but I am not sure of the root cause of duplicate series (whether from a scanner, matching, or manual editing of an item). I briefly looked at all occurrences of `SeriesModel` and did not see anything that looked like it could cause duplicate series like this, so it may be a leftover bug from an older server version.
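The cleanup described above can be sketched in memory as follows. This is a simplified illustration over plain objects; the actual PR operates on Sequelize models, but the logic is the same: group series by (libraryId, name), keep the newest in each group, repoint `bookSeries` rows, and drop the duplicates.

```javascript
// Hypothetical in-memory sketch of the duplicate-series cleanup.
// `series` rows: { id, libraryId, name, createdAt }
// `bookSeries` rows: { bookId, seriesId } (mutated in place to repoint)
function cleanupDuplicateSeries(series, bookSeries) {
  // Group series by (libraryId, name); \u0000 avoids key collisions in names
  const groups = new Map()
  for (const s of series) {
    const key = `${s.libraryId}\u0000${s.name}`
    if (!groups.has(key)) groups.set(key, [])
    groups.get(key).push(s)
  }
  const keepIds = new Set()
  const remap = new Map() // duplicate series id -> surviving series id
  for (const group of groups.values()) {
    // Keep the newest series in each duplicate group
    group.sort((a, b) => b.createdAt - a.createdAt)
    const [keep, ...dupes] = group
    keepIds.add(keep.id)
    for (const d of dupes) remap.set(d.id, keep.id)
  }
  // Repoint bookSeries rows that reference a deleted duplicate
  for (const bs of bookSeries) {
    if (remap.has(bs.seriesId)) bs.seriesId = remap.get(bs.seriesId)
  }
  // Return only the surviving series rows
  return series.filter((s) => keepIds.has(s.id))
}

module.exports = cleanupDuplicateSeries
```

Note that series with the same name in different libraries fall into different groups, so they are never merged, matching the fix described above for the original name-only approach.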