Hi!
I have a problem when using Sage with a relatively large search FASTA file. I can successfully run the test example, and other FASTA databases with fewer than 20,000 sequences work on my MS data. But when I run on a large FASTA file, the command exits with "Segmentation fault (core dumped)" and no other information. This happens both on my local machine and on an HPC node with 200 GB of memory.
You'll need to either use a computer with more RAM, reduce the search space (fewer variable mods, no semi-enzymatic search), or reduce the size of your FASTA (e.g. search in multiple chunks and then recombine the results - see #97 for a mitigation; a sketch of the chunking approach is below).
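For the chunking route, something like the following can work: split the database into a few smaller FASTA files, run Sage on each one separately, then recombine. This is a minimal sketch, not part of Sage itself; the file names, the chunk count, and the `split_fasta` helper are all placeholders for illustration.

```python
def read_fasta(path):
    """Yield (header, sequence) pairs from a FASTA file."""
    header, seq = None, []
    with open(path) as fh:
        for line in fh:
            line = line.rstrip()
            if line.startswith(">"):
                if header is not None:
                    yield header, "".join(seq)
                header, seq = line, []
            else:
                seq.append(line)
        if header is not None:
            yield header, "".join(seq)

def split_fasta(path, n_chunks, prefix="chunk"):
    """Distribute entries round-robin into n_chunks smaller FASTA files,
    so each chunk is roughly the same size."""
    outs = [open(f"{prefix}_{i}.fasta", "w") for i in range(n_chunks)]
    try:
        for i, (header, seq) in enumerate(read_fasta(path)):
            outs[i % n_chunks].write(f"{header}\n{seq}\n")
    finally:
        for out in outs:
            out.close()

if __name__ == "__main__":
    # "large_database.fasta" and n_chunks=4 are example values;
    # pick a chunk count small enough that each search fits in RAM.
    split_fasta("large_database.fasta", n_chunks=4)
```

Note that after searching the chunks independently you'll still need to recombine the results and redo FDR control across the merged output, since scores and q-values from separate searches aren't directly comparable; #97 discusses this in more detail.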