Hi,
I am trying to analyze 3 samples, but ATLAS crashes with the following error:
Building DAG of jobs...
Creating conda environment /media/archaea/UBUNTU-HD/miniconda3/envs/atlasenv/lib/python3.8/site-packages/atlas/workflow/rules/../envs/assembly.yaml...
Downloading and installing remote packages.
Environment for ../../miniconda3/envs/atlasenv/lib/python3.8/site-packages/atlas/workflow/envs/assembly.yaml created (location: ../atlas_database/conda_envs/a1293008546da0884a56f95a1ffd71c1)
Using shell: /usr/bin/bash
Provided cores: 2
Rules claiming more threads will be scaled down.
Singularity containers: ignored
Job stats:
job count min threads max threads
DRAM_destill 1 1 1
DRAM_set_db_loc 1 1 1
align 1 2 2
align_reads_to_Genecatalog 3 2 2
align_reads_to_MAGs 3 2 2
align_reads_to_final_contigs 3 2 2
all 1 1 1
all_contigs2bins 1 1 1
all_gtdb_trees 1 1 1
all_prodigal 1 1 1
assembly 1 1 1
assembly_one_sample 3 1 1
bam_2_sam_contigs 3 2 2
binning 1 1 1
build_assembly_report 1 1 1
build_bin_report 1 1 1
build_db_genomes 1 2 2
build_qc_report 1 1 1
calculate_contigs_stats 6 1 1
calculate_insert_size 3 2 2
classify 1 2 2
cluster_genes 1 2 2
combine_bin_stats 1 1 1
combine_bined_coverages_MAGs 1 1 1
combine_contig_stats 1 1 1
combine_coverages_MAGs 1 1 1
combine_egg_nogg_annotations 1 1 1
combine_gene_coverages 1 1 1
combine_insert_stats 1 1 1
combine_read_counts 1 1 1
combine_read_length_stats 1 1 1
combine_taxonomy 1 1 1
concat_annotations 1 1 1
concat_genes 1 1 1
convert_sam_to_bam 6 2 2
dereplication 1 2 2
do_not_filter_contigs 3 1 1
dram_download 1 2 2
filter_genes 1 1 1
finalize_contigs 3 1 1
finalize_sample_qc 3 1 1
gene2genome 1 1 1
gene_subsets 1 1 1
genecatalog 1 1 1
genomes 1 1 1
get_all_bins 1 1 1
get_all_modules 1 1 1
get_bins 3 1 1
get_contigs_from_gene_names 3 1 1
get_metabat_depth_file 3 2 2
get_quality_for_dRep_from_checkm 1 1 1
get_unique_cluster_attribution 3 1 1
identify 1 2 2
merge_checkm 1 1 1
merge_pairs 1 2 2
metabat 3 2 2
parse_clstr_files 1 1 1
pileup 3 2 2
pileup_Genecatalog 3 2 2
pileup_MAGs 3 2 2
predict_genes 3 1 1
qc 1 1 1
rename_contigs 3 2 2
rename_gene_catalog 1 1 1
rename_gene_clusters 1 1 1
rename_genomes 1 1 1
rename_spades_output 3 1 1
run_all_checkm_lineage_wf 1 2 2
run_checkm_lineage_wf 3 2 2
run_checkm_tree_qa 3 1 1
run_spades 3 2 2
write_read_counts 1 1 1
total 128 1 2
[Tue Nov 9 17:25:45 2021]
rule run_spades:
input: tratada/assembly/reads/QC.errorcorr.merged_R1.fastq.gz, tratada/assembly/reads/QC.errorcorr.merged_R2.fastq.gz, tratada/assembly/reads/QC.errorcorr.merged_me.fastq.gz
output: tratada/assembly/contigs.fasta, tratada/assembly/scaffolds.fasta
log: tratada/logs/assembly/spades.log
jobid: 84
benchmark: logs/benchmarks/assembly/spades/tratada.txt
wildcards: sample=tratada
threads: 2
resources: tmpdir=/tmp, mem=16, time=48, mem_mb=16000, time_min=2880
Activating conda environment: /media/archaea/UBUNTU-HD/doutorado/atlas_database/conda_envs/a1293008546da0884a56f95a1ffd71c1
[Tue Nov 9 17:25:45 2021]
Error in rule run_spades:
jobid: 84
output: tratada/assembly/contigs.fasta, tratada/assembly/scaffolds.fasta
log: tratada/logs/assembly/spades.log (check log file(s) for error message)
conda-env: /media/archaea/UBUNTU-HD/doutorado/atlas_database/conda_envs/a1293008546da0884a56f95a1ffd71c1
shell:
rm -f tratada/assembly/pipeline_state/stage_*_copy_files 2> tratada/logs/assembly/spades.log ; spades.py --threads 2 --memory 16 -o tratada/assembly -k 21,33,55,77,99,127 --restart-from last >> tratada/logs/assembly/spades.log 2>&1
(one of the commands exited with non-zero exit code; note that snakemake uses bash strict mode!)
Shutting down, this might take some time.
Exiting because a job execution failed. Look above for error message
Complete log: /media/archaea/UBUNTU-HD/doutorado/atlas/.snakemake/log/2021-11-09T172506.723742.snakemake.log
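The Snakemake output above only reports that the `run_spades` command exited non-zero; the actual SPAdes error is written to the rule's own log file. Below is a hedged sketch of how one might start diagnosing this (the paths are taken from the log above; the note about clearing a partial assembly is an assumption about how `--restart-from last` behaves, not an ATLAS-documented fix):

```shell
# Hypothetical paths, copied from the Snakemake log above; adjust to your run directory.
LOG=tratada/logs/assembly/spades.log
ASSEMBLY=tratada/assembly

# 1) Read the end of the SPAdes log, where SPAdes prints its error message.
if [ -f "$LOG" ]; then
  tail -n 50 "$LOG"
else
  echo "no spades log found at $LOG"
fi

# 2) ATLAS invokes 'spades.py --restart-from last', which requires a resumable
#    previous run inside the output directory. If the directory holds only a
#    broken partial run, removing it (assumption: ATLAS will then rerun SPAdes
#    from scratch) is one way to get past a failed restart.
if [ -d "$ASSEMBLY" ] && [ ! -f "$ASSEMBLY/contigs.fasta" ]; then
  echo "partial assembly present: $ASSEMBLY (consider removing it before rerunning)"
fi
```

Note also that the job ran with `--memory 16` (GB) and 2 threads; SPAdes assemblies with k-mers up to 127 can exceed 16 GB on real metagenomes, so an out-of-memory kill in the log would point to raising the memory limit in the ATLAS config.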