
Setting for concurrent conversions #6

Open
mattpurnell opened this issue Apr 23, 2022 · 3 comments

@mattpurnell

Hey - great tool, so easy to use!

I only have one issue - my server runs at full blast, 99% CPU, while converting, and if I put too many books through it shuts down. Maybe my CPU heatsink needs reseating; I'm not sure.

Is there a setting to reduce the number of concurrent conversions? It's currently running 4 at a time; I'd like to try reducing it to 2.

If there is a script that contains this as a variable, I could copy it to my host and point Docker at it there - if so, what's the path?

Thanks for your contribution!
Matt

@9Mad-Max5
Owner

9Mad-Max5 commented May 10, 2022

Hi @mattpurnell

Sorry for taking so long to reply to your issue.

You have two options to better manage the load.
The one I would recommend is setting resource limits directly in the docker-compose file:

deploy:
  resources:
    limits:
      cpus: '0.5'
      memory: 2G

Adding this block, with deploy at the same level as e.g. the restart policy, places the following hard limits on the Docker container itself.

The container may only allocate 50% of one CPU core and a maximum of 2 GB of RAM.
This should keep some headroom, and you can adjust the values to your needs.
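For context, here is a minimal sketch of where the block sits in a full docker-compose service (the service name and image are placeholders; only the deploy block is the point, and older docker-compose versions may need the --compatibility flag for deploy limits to apply outside of swarm mode):

services:
  auto-m4b:                    # placeholder service name
    image: your-m4b-tool-image # placeholder image reference
    restart: unless-stopped    # deploy sits at the same level as restart
    deploy:
      resources:
        limits:
          cpus: '0.5'          # at most half of one CPU core
          memory: 2G           # at most 2 GB of RAM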

The other option, which I wouldn't recommend, is editing the following file directly inside your container:
https://github.com/9Mad-Max5/docker-m4b-tool/blob/main/auto-m4b-tool.sh

In line 56, where m4b-tool is called, there is a --jobs=4 parameter; you could change it to --jobs=2.

But there is no guarantee that the change will survive a container update, or that it will achieve your goal, as two parallel conversions can still consume a lot of your CPU.
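If you do edit the script, one way to make the change survive image updates is to keep your edited copy on the host and bind-mount it over the one in the image, roughly like this (the container-side path is an assumption; check where the script actually lives in the image):

volumes:
  # mount the locally edited script over the copy baked into the image;
  # /auto-m4b-tool.sh is a guess at the in-container path
  - ./auto-m4b-tool.sh:/auto-m4b-tool.sh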

@mattpurnell
Author

Excellent, I can expose the .sh file outside of the container.

I tried limiting the Docker CPU, but weirdly the process seems to run outside of the container and continues to use all my CPU.

If I update the container I can copy and edit the updated .sh, no problem.
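Something like this should pull the current script out of the running container for editing ("auto-m4b" is a placeholder for my container name, and the in-container path is a guess):

# copy the script from the running container to the host for editing
docker cp auto-m4b:/auto-m4b-tool.sh ./auto-m4b-tool.sh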

Thanks!

Oh, one more thing - a few audiobooks I composed from individual MP3s created a single m4b as expected, but with no chapters - are there extra controls I can use to ensure the MP3s are inserted as chapters?

@9Mad-Max5
Owner

Hmm, that's really weird; maybe I need to dig into Docker once more.
Your observation matches another issue, where the process also seemed to allocate resources outside of the container.

But to my knowledge that's just not possible.

I don't think I understood the second part correctly.
Usually the individual MP3s become the chapters inside the m4b audiobook.
If the source is just one huge MP3, m4b-tool unfortunately can't do much.
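In case it helps, merging manually along these lines normally gives one chapter per input MP3 (paths are placeholders, and the --use-filenames-as-chapters flag, which should name the chapters after the source files, is worth checking against your installed m4b-tool version):

# merge a directory of MP3s into one m4b, one chapter per file
m4b-tool merge "my-book/" --output-file "my-book.m4b" --jobs=2 --use-filenames-as-chapters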
