Receiving Import error #16
Comments
The DART-ID conda environment (https://github.com/SlavovLab/DART-ID/blob/master/environment.yml) is set up to run Python 3.7.6. Is there a specific reason you need to run Python 3.9.7?
Hi, thanks for your reply. I need that particular version to run other programs. Your description of the program says that it runs on Python >= 3.7. BR
Thanks for letting me know about the description -- this program was released when Python 3.7 was the latest version, and I was trying to communicate that it would work with any 3.7 version. I will update the description to be more explicit about this requirement. In the meantime, you should be able to use Python virtualenvs (https://docs.python.org/3/library/venv.html) or conda environments (https://conda.io/projects/conda/en/latest/user-guide/tasks/manage-environments.html) to run DART-ID in a separate Python environment, so that you can still run your other programs on their other Python versions. Let me know if you need any help with this.
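A minimal sketch of the environment setup suggested above, assuming conda is available and that environment.yml names the environment dart-id; the pip install line is only one possible way to install the package and is not taken from the repo's instructions:

# Option A: conda environment from the repo's environment.yml (Python 3.7.6)
conda env create -f environment.yml
conda activate dart-id   # environment name assumed

# Option B: a plain virtualenv on a Python 3.7 interpreter
python3.7 -m venv dart_env
source dart_env/bin/activate
pip install git+https://github.com/SlavovLab/DART-ID.git   # install method assumed

# Everything outside the environment keeps using the system Python 3.9.7.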
Hi, I ran the program in a virtual environment now and it works. I received an error while running my first evidence file:

dart_id -c /home/lukas/Desktop/MS-Data/Lukas/mq-run_150223/combined/txt/config_annotated.yaml -o /home/lukas/Desktop/MS-Data/Lukas/mq-run_150223/combined/txt/output_dart_id

I used your default config file and changed only the input files to one evidence file from my first MaxQuant run. Where do I define the n-experiment argument? Thanks for your help! BR
Hi Lukas, The
More importantly, however -- DART-ID can only infer latent retention times by using data from multiple LC-MS runs. If you only provide one experiment, there is no statistical power to be gained. That is, if there is a low-confidence peptide in run A, we can increase confidence in our observation in run A if we see the same peptide at the same RT in run B (and ideally in runs C, ..., N -- the more experiments we use, the more power we have). I would strongly recommend not using DART-ID if you only have one run, and to only use this tool if you have multiple (and ideally many) similarly configured LC-MS runs. If you have any more questions, let me know.
Hi Albert, thank you for your assistance. My data is acquired from single-cell monocytes. I think it would be a good idea to align retention times and include your program in my workflow, although it is my 4th week in proteomics and I have only acquired one run successfully with MaxQuant. Hence the program should work, but with no improvement in PSM scores, right? I also received this line in a traceback:

File "/home/lukas/anaconda3/envs/dart_env/lib/python3.7/shutil.py", line 104, in copyfile

I appreciate your help. BR
Hi Albert, I have one additional question: BR
Do not run this program with just one run -- there is no improvement to be gained and the code relies on multiple experiments (and PSMs existing across experiments).
There are no constraints on the chemistry of the labeling or LC -- the liquid chromatography just has to be consistent. In our paper we use DART-ID on TMT-labeled and label-free runs. However, do not mix runs of different chemistries/chromatographies, or even runs that are far apart (and thus not reproducible). For example, do not mix label-free and TMT-labeled runs -- the TMT labeling itself chemically modifies peptides and alters their retention times (which DART-ID assumes to be so consistent that each run only requires a small linear adjustment to hit the "true" retention time). Hope this helps!
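A minimal sketch of the advice above, using the same -c/-o flags as the command earlier in this thread; config_tmt.yaml and config_lf.yaml are hypothetical config files that each list only evidence files from runs with the same labeling chemistry and LC setup:

# Keep TMT-labeled and label-free runs in separate DART-ID invocations,
# never mixed in a single config
dart_id -c config_tmt.yaml -o output_tmt
dart_id -c config_lf.yaml -o output_lf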
Hi,
An error is returned when running the program:
ImportError: cannot import name 'gcd' from 'fractions' (/home/lukas/anaconda3/lib/python3.9/fractions.py)
As per Stack Overflow, the problem is caused by the networkx module, whose import statements changed across Python versions (I am using Python 3.9.7).
BR
Lukas
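A minimal sketch of two possible workarounds, assuming the error comes from an older networkx release that still does "from fractions import gcd", a function that was removed from the fractions module in Python 3.9:

# Option 1: upgrade networkx in the environment that runs DART-ID
# (recent releases import gcd from math instead of fractions; note this
# may conflict with DART-ID's pinned dependencies)
pip install --upgrade networkx

# Option 2: run DART-ID under Python 3.7, where the pinned networkx works
# unchanged, e.g. via the conda environment from the repo's environment.yml
conda env create -f environment.yml
conda activate dart-id   # environment name assumed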