
Quantized training fails when a model is too complex #3881

Workflow file for this run

name: No Response Bot
on:
  issue_comment:
    types: [created]
  schedule:
    # "every day at 04:00 UTC"
    - cron: '0 4 * * *'
jobs:
  noResponse:
    runs-on: ubuntu-22.04
    steps:
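      # lee-dohm/no-response closes issues that still carry the
      # `responseRequiredLabel` after `daysUntilClose` days without a reply,
      # and re-opens a closed issue when its original author comments again.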
      - uses: lee-dohm/no-response@v0.5.0  # version tag assumed; the reference is obscured in the page capture
        with:
          closeComment: >
            This issue has been automatically closed because it has been awaiting a response for too long.
            When you have time to work with the maintainers to resolve this issue, please post a new comment and it will be re-opened.
            If the issue has been locked for editing by the time you return to it, please open a new issue and reference this one.
            Thank you for taking the time to improve LightGBM!
          daysUntilClose: 30
          responseRequiredLabel: awaiting response
          token: ${{ github.token }}