At the moment, semi-enzymatic search is essentially unusable unless you have a ton of RAM. Fragment indexing requires pre-digesting every peptide and generating every fragment, which is very resource-intensive for semi-enzymatic (or non-enzymatic) digests.
I am working on an internal database-splitting solution to at least partially alleviate the problem (and hopefully improve it over time). In the meantime, you can work around it by reducing the number of missed cleavages, using a significantly smaller FASTA file, or processing the FASTA database in chunks: #97 (comment)
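The chunked-search workaround can be sketched as a small script that splits the FASTA into several smaller files, each searched in a separate run. This is a minimal illustration, not part of the tool; the function names and the round-robin split strategy are my own choices here.

```python
# Hypothetical helper: split a FASTA database into N smaller files so each
# chunk can be searched in its own run, keeping peak memory lower.

def read_fasta(path):
    """Yield (header, sequence) pairs from a FASTA file."""
    header, seq = None, []
    with open(path) as fh:
        for line in fh:
            line = line.rstrip()
            if line.startswith(">"):
                if header is not None:
                    yield header, "".join(seq)
                header, seq = line, []
            else:
                seq.append(line)
        if header is not None:
            yield header, "".join(seq)

def split_fasta(path, n_chunks, prefix="chunk"):
    """Distribute entries round-robin into n_chunks smaller FASTA files."""
    entries = list(read_fasta(path))
    paths = []
    for i in range(n_chunks):
        out = f"{prefix}_{i}.fasta"
        with open(out, "w") as fh:
            for header, seq in entries[i::n_chunks]:
                fh.write(f"{header}\n{seq}\n")
        paths.append(out)
    return paths
```

Note that searching chunks separately changes the target space per run, so results (e.g. FDR estimates) from separate chunks are not directly equivalent to a single search over the full database and need to be combined with care.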
It seems like semi-enzymatic searches are not completing?
At least that is what happens if I try the following settings:
If I remove the last line or replace it with
"semi_enzymatic": null
the search completes and is as fast as before. I'm using the default SearchGUI/PeptideShaker example dataset/input, so there should be no unexpected issues there.
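For reference, the setting in question sits inside the enzyme block of the JSON search configuration. The surrounding keys below are illustrative values, not the exact file from this report:

```json
{
  "database": {
    "enzyme": {
      "missed_cleavages": 2,
      "cleave_at": "KR",
      "restrict": "P",
      "semi_enzymatic": true
    }
  }
}
```

Setting `"semi_enzymatic"` to `null` (or removing the line) disables the semi-enzymatic digest, which is why the search then finishes quickly.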
Any idea what is happening?