Aviary v0.6.0: I'm getting lots of Aborted (core dumped) and Segmentation fault (core dumped) errors in the vamb_jgi_filter rule. I think this happens when coverm.cov is too large to fit in the seemingly arbitrary memory Snakemake assigns to the job (presumably via the --default-resources formula mem_mb=max(2*input.size_mb, 1000) visible in the command below); e.g. the job below gets 1000 MB while others get 10648 MB. Any ideas?
[Mon Jun 26 13:15:04 2023]
rule vamb_jgi_filter:
input: /mnt/hpccs01/work/microbiome/msingle/sam/projects/23-SRA-coassembly-r214/results/aviary/target/20230619/p__DRYD01/coassemble/coassemble/coassembly_2/assemble/assembly/final_contigs.fasta, data/coverm.cov
output: data/coverm.filt.cov
jobid: 23
reason: Missing output files: data/coverm.filt.cov; Input files updated by another job: data/coverm.cov
threads: 64
resources: mem_mb=1000, disk_mb=1000, tmpdir=/data1/tmp
/mnt/hpccs01/work/microbiome/conda/envs/aviary-v0.6.0/lib/python3.10/site-packages/google/protobuf/internal/api_implementation.py:110: UserWarning: Selected implementation cpp is not available.
warnings.warn(
Building DAG of jobs...
Using shell: /bin/bash
Provided cores: 64
Rules claiming more threads will be scaled down.
Provided resources: mem_mb=1000, disk_mb=1000
Select jobs to execute...
thread panicked while processing panic. aborting.
/bin/sh: line 1: 87862 Aborted (core dumped) /mnt/hpccs01/work/microbiome/conda/envs/aviary-v0.6.0/bin/python3.10 -m snakemake --snakefile '/mnt/hpccs01/work/microbiome/sw/aviary_repos/aviary-v0.6.0/aviary/aviary/modules/Snakefile' 'data/coverm.filt.cov' --allowed-rules 'vamb_jgi_filter' --cores 64 --attempt 1 --force-use-threads --resources 'mem_mb=1000' 'disk_mb=1000' --quiet --force --keep-target-files --keep-remote --max-inventory-time 0 --nocolor --notemp --no-hooks --nolock --ignore-incomplete --rerun-triggers 'code' 'software-env' 'input' 'params' 'mtime' --skip-script-cleanup --use-conda --conda-frontend 'mamba' --conda-prefix '/mnt/hpccs01/work/microbiome/conda' --conda-base-path '/mnt/hpccs01/work/microbiome/conda/envs/aviary-v0.6.0' --wrapper-prefix 'https://github.com/snakemake/snakemake-wrappers/raw/' --local-groupid 'local' --configfiles '/mnt/hpccs01/work/microbiome/msingle/sam/projects/23-SRA-coassembly-r214/results/aviary/target/20230619/p__DRYD01/coassemble/coassemble/coassembly_2/recover/config.yaml' --latency-wait 5 --scheduler 'ilp' --scheduler-solver-path '/mnt/hpccs01/work/microbiome/conda/envs/aviary-v0.6.0/bin' --default-resources 'mem_mb=max(2*input.size_mb, 1000)' 'disk_mb=max(2*input.size_mb, 1000)' "tmpdir='/data1/tmp'" --directory '/mnt/hpccs01/work/microbiome/msingle/sam/projects/23-SRA-coassembly-r214/results/aviary/target/20230619/p__DRYD01/coassemble/coassemble/coassembly_2/recover' --mode 1
Removing output files of failed job vamb_jgi_filter since they might be corrupted:
data/coverm.filt.cov
Shutting down, this might take some time.
Exiting because a job execution failed. Look above for error message
Complete log: .snakemake/log/2023-06-26T101113.123452.snakemake.log
An error occurred
06/26/2023 01:16:42 PM CRITICAL: Command 'snakemake --snakefile /mnt/hpccs01/work/microbiome/sw/aviary_repos/aviary-v0.6.0/aviary/aviary/modules/Snakefile --directory /mnt/hpccs01/work/microbiome/msingle/sam/projects/23-SRA-coassembly-r214/results/aviary/target/20230619/p__DRYD01/coassemble/coassemble/coassembly_2/recover --jobs 64 --rerun-incomplete --configfile '/mnt/hpccs01/work/microbiome/msingle/sam/projects/23-SRA-coassembly-r214/results/aviary/target/20230619/p__DRYD01/coassemble/coassemble/coassembly_2/recover/config.yaml' --nolock --conda-frontend mamba --default-resources "tmpdir='/data1/tmp'" --resources mem_mb=512000 --use-conda --conda-prefix /mnt/hpccs01/work/microbiome/conda recover_mags' returned non-zero exit status 1.
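In case it's useful, the stopgap I've been sketching is to raise the reservation for this one rule rather than bumping --resources globally. Below is only a sketch of Snakemake's generic per-rule resource hooks, assuming the Aviary Snakefile can be edited locally; the rule body is a placeholder and the 8 GB floor / 4x multiplier are guesses, not measured requirements for coverm.cov:

```python
# Minimal sketch, NOT Aviary's actual vamb_jgi_filter rule.
# The point is the mem_mb callable: it scales with the on-disk size of
# data/coverm.cov at scheduling time and grows on each retry, instead of
# inheriting the global max(2*input.size_mb, 1000) default visible in the
# failing command above.
rule vamb_jgi_filter:
    input:
        cov="data/coverm.cov"
    output:
        "data/coverm.filt.cov"
    threads: 64
    resources:
        mem_mb=lambda wildcards, input, attempt: attempt * max(4 * input.size_mb, 8000)
    # Placeholder command only; the real rule runs Aviary's filtering step.
    shell:
        "cp {input.cov} {output}"
```

The same effect without touching the Snakefile should be possible with snakemake's --set-resources option (e.g. --set-resources vamb_jgi_filter:mem_mb=64000), if Aviary has a way to pass extra snakemake arguments through.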