I will try to explain the issue as clearly as I can. Since Google introduced its new restriction on files, I have been experiencing this problem: most of the time when I clone, my CPU usage is 10-20%, but under the new 2 TB rule, if I try to clone a file whose 2 TB limit is already used up, the bot gets stuck in a loop trying to clone the file with different SA (service account) accounts over and over. CPU usage goes up to 100% and the bot hangs or crashes.
So is there a way to cap the clone retries at a certain number of SA accounts, say 10-20? After that it should show an error instead of looping through all 100 SA accounts multiple times.
Right now the bot assumes the SA account has reached its limit, but that is not the case; it is the file that has reached its limit, not the SA accounts. So there is no benefit in retrying with different SA accounts.
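A minimal sketch of what such a cap could look like, assuming the clone helper raises an exception when a quota is hit. `MAX_SA_RETRIES`, `copy_file`, and `switch_service_account` are hypothetical names for illustration only, not this project's actual functions:

```python
# Sketch only: cap service-account switches instead of cycling through
# every SA indefinitely. All names here are placeholders (assumptions),
# not the project's real API.

MAX_SA_RETRIES = 15  # assumed cap, e.g. somewhere in the 10-20 range suggested above


def clone_with_capped_retries(file_id, dest_id, copy_file, switch_service_account):
    """Attempt a clone, switching service accounts at most MAX_SA_RETRIES times.

    copy_file(file_id, dest_id) is assumed to raise on a quota error
    (e.g. a 403 from the Drive API); switch_service_account() is assumed
    to rotate to the next SA. If the file itself has exhausted its 2 TB
    quota, no SA will help, so stop after the cap instead of looping forever.
    """
    last_error = None
    for attempt in range(1, MAX_SA_RETRIES + 1):
        try:
            return copy_file(file_id, dest_id)
        except Exception as err:  # in real code, catch only the specific quota error
            last_error = err
            if attempt < MAX_SA_RETRIES:
                switch_service_account()
    raise RuntimeError(
        f"Clone failed after trying {MAX_SA_RETRIES} service accounts; "
        f"the file's own quota is probably exhausted: {last_error}"
    )
```

The point of the cap is simply to surface a clear error once it is evident that rotating SAs cannot succeed, rather than burning CPU retrying all 100 accounts repeatedly.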
Thank you for making and maintaining this amazing project.