I get an "FATAL ERROR: CALL_AND_RETRY_2 Allocation failed - process out of memory" error trying to migrate a mysql table to mongodb. I tried with another table and I got the same error. In the first case after about 79k rows processed, and in the second after about 140k. I mention that I have nothing fancy in the config.json - just field to field migration, no data modifiers.
If I run it with the --trace-gc flag I get "allocation failure" pretty fast.
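For reference, this is the kind of invocation I mean (migrate.js is just a placeholder for the actual script name; --max-old-space-size only raises V8's heap ceiling, it's not a fix for the leak):

```
node --trace-gc --max-old-space-size=1024 migrate.js
```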
Please fix this, because it is pretty much unusable if it crashes after 79k rows. I need it to work on a few dozen million rows, so deleting the processed rows and restarting the script is not a viable option.
PS: I'm running on WinXP 32-bit. I tried today on Win7 x64 and there it stops at record 53, but not because of the memory leak. I think that is a queue problem, as it only queues 53 records (for another table it only queues 17 records):
......
queue.length = 52
queue.length = 53
and then it starts importing and stops after 53 rows.
Another thing: I think the memory leak is related to the mongo library, because if I do a dry run it gets past 79k rows. I initially thought it came from the mysql library (I read mysqljs/mysql#471), but I updated to 2.0 alpha 7 (you used alpha 3) and still have the same issue.
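For what it's worth, here is a rough sketch of the kind of streaming approach that would keep memory flat, using the mysql 2.x row-event API with pause()/resume() and plain inserts through the node-mongodb-native driver. The connection settings, table and collection names are placeholders, and I haven't checked this against the project's code, so treat it as an illustration of the idea rather than a drop-in fix:

```js
// Sketch only: stream rows out of MySQL and insert them one at a time,
// pausing the connection so rows are never buffered faster than Mongo can write.
// Host/db/table/collection names below are placeholders.
var mysql = require('mysql');
var MongoClient = require('mongodb').MongoClient;

var connection = mysql.createConnection({
  host: 'localhost',
  user: 'root',
  database: 'sourcedb'
});

MongoClient.connect('mongodb://localhost:27017/targetdb', function (err, db) {
  if (err) throw err;
  var collection = db.collection('target_collection');
  var migrated = 0;

  connection.query('SELECT * FROM source_table')
    .on('error', function (err) {
      throw err;
    })
    .on('result', function (row) {
      // Back-pressure: stop MySQL from pushing more rows until this one is written.
      connection.pause();
      collection.insert(row, function (err) {
        if (err) throw err;
        migrated++;
        if (migrated % 10000 === 0) console.log(migrated + ' rows migrated');
        connection.resume();
      });
    })
    .on('end', function () {
      console.log('done, ' + migrated + ' rows');
      connection.end();
      db.close();
    });
});
```

Inserting row by row is slow, so batching a few hundred rows per insert would be the obvious refinement, but the point is that memory use stays flat regardless of table size.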