problem during authentication #41
Comments
This was fixed by adding a delay between issues. I used time.sleep(10).
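For anyone reading later, a minimal sketch of that workaround. `import_issue` is a hypothetical helper standing in for whatever function posts a single issue:

```python
import time

def import_all(issues):
    """Post issues one at a time, pausing between requests to stay under the abuse limits."""
    for issue in issues:
        import_issue(issue)   # hypothetical helper that does the actual POST
        time.sleep(10)        # fixed delay between issues, as suggested above
```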
Too early to close this.
I tried the catch and retry approach and managed to import some more issues, until not a single request could come through anymore. My guess is my IP is now flagged as spam. Has anyone ever tried to migrate a large (250) batch of issues to github.com and succeeded?
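A rough sketch of the catch-and-retry idea mentioned above, assuming the `requests` library; it backs off and retries a few times before giving up rather than failing on the first rejected request:

```python
import time
import requests

def post_with_retry(url, payload, headers, retries=5, backoff=30):
    """POST one issue, retrying with a fixed back-off when GitHub rejects the request."""
    for attempt in range(retries):
        resp = requests.post(url, json=payload, headers=headers)
        if resp.status_code < 400:
            return resp
        print(f"Got {resp.status_code}, retrying in {backoff}s "
              f"(attempt {attempt + 1}/{retries})")
        time.sleep(backoff)
    resp.raise_for_status()
```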
Had the same issue. I wrote a new script for myself looking at this page: https://developer.github.com/v3/#rate-limiting If you authenticate your script with a personal access token (Settings > Applications > Personal access tokens), then you can do 20 requests per minute, otherwise far fewer. I'm currently moving something like 450 issues, and it will take about 30 minutes (sleeping every 5 seconds).
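A minimal sketch of what that looks like in practice with the `requests` library and a personal access token created under Settings > Applications > Personal access tokens (the token string is a placeholder):

```python
import time
import requests

TOKEN = "<personal-access-token>"   # placeholder, never commit a real token
SLEEP_BETWEEN_REQUESTS = 5          # ~12 requests/minute, safely under the limit

session = requests.Session()
session.headers["Authorization"] = f"token {TOKEN}"
session.headers["Accept"] = "application/vnd.github.v3+json"

def create_issue(repo, title, body):
    """Create one issue in the 'owner/repo' repository, then pause to respect the rate limit."""
    resp = session.post(f"https://api.github.com/repos/{repo}/issues",
                        json={"title": title, "body": body})
    resp.raise_for_status()
    time.sleep(SLEEP_BETWEEN_REQUESTS)
    return resp.json()
```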
New report: in case of error the script is useless, because there's nothing that allows you to skip already inserted entries. I modified it to implement such functionality (and of course skip creation of milestones and labels if there's something to skip) and it's working fine. This is functionality that definitely should be included in the official script (like a --skip option). Skipping entries (milestones and labels) no longer terminates the script with the "422 Unprocessable Entity" error. EDIT: Using the script it is also possible to close the issues and preserve the assignee (I did it). That would be useful to add to this script as well.
@kewinrausch Could you post your changes or even issue a pull request here? I'm experiencing the same problem and sleeping for 4 seconds between each request does not seem to be good enough. Seems like a conditional that would check the user's rate limit and confirm it's >0 before proceeding would be better. Interested to see what you came up with.
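Something along the lines of that conditional could be written against the documented `GET /rate_limit` endpoint; this is only a sketch, not part of the script:

```python
import time
import requests

def wait_if_rate_limited(session, minimum=1):
    """Check the remaining core quota and sleep until it resets when it has run out.

    `session` is an authenticated requests.Session; requests to /rate_limit
    do not count against the quota.
    """
    core = session.get("https://api.github.com/rate_limit").json()["resources"]["core"]
    if core["remaining"] < minimum:
        # 'reset' is a Unix timestamp marking when the quota refills
        time.sleep(max(core["reset"] - time.time(), 0) + 5)
```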
@brianwc I didn't fork this project, so I have only a local copy of the modifications. I renamed the .py main file to .png and posted it at the end of this answer; just save it on your PC and open it with a text editor (or rename the extension to .txt or .py). At the beginning you'll find some global variables I used to set the wait time between each request, after how many issues to stop, which comments to skip (if the process stopped while processing an issue's comments), which issues to skip (already done in the previous session), and the issue/pull-request number offset. The last variable is needed if you import the issues into a repo which already contains other issues. Feel free to ask me for additional info, and perform some tests before using it on the real repo.
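For anyone who can't open the renamed file, the knobs described there look roughly like this (the variable names are illustrative, not necessarily the ones in the modified script):

```python
# Seconds to wait between consecutive API requests.
REQUEST_DELAY = 5

# Stop after this many issues have been imported in the current run (0 = no limit).
STOP_AFTER = 100

# Skip entries already created in a previous, interrupted run.
ISSUES_TO_SKIP = 0
COMMENTS_TO_SKIP = 0

# Offset added to issue numbers when the target repository already contains
# issues or pull requests, so cross-references keep pointing at the right entry.
ISSUE_NUMBER_OFFSET = 0
```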
Any luck getting this to work? We have like 400 issues, and I don't want to do it by hand.
I already exported 400+ issues and pull requests in groups of about 100 at a time. Use my gh-issue-import image (download and rename it as a .py file) and read my previous post.
I ended up doing it ~5 issues at a time. Took me like an hour and my business partner got like 400 emails, but it's done now.
They can unsubscribe from events during the import procedure, so no spam in that case. I had to do 100 and then wait about 30 minutes to respect the limits. The only faster way is to build and register an application.
If it was more than an occasional problem I probably would.
Anyway, the auth can be changed to use the
I've just created a PR that contains the changes mentioned by @kewinrausch in #41 (comment). Here's a link to the file changes if anyone who stumbles on this thread is interested.
While trying to migrate issues from a GitHub Enterprise instance to github.com, I ran into a 401 (Unauthorized) error.
Weirdly enough, the milestones, labels, and the first 11 issues (with comments) were successfully migrated, so I don't understand where the sudden 401 is coming from.