change test to make it fail #5

Open · wants to merge 1 commit into base: main

Conversation

lldelisle (Owner)

No description provided.


Test Results (powered by Planemo)

Test Summary

Test State | Count
---------- | -----
Total      |     2
Passed     |     1
Error      |     0
Failure    |     1
Skipped    |     0
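
These counts come from Planemo's workflow test runner. A local run along the following lines (workflow path inferred from the failure report below) should reproduce them:

    planemo test workflows/data-fetching/parallel-accession-download/parallel-accession-download.ga
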
Failed Tests
  • ❌ parallel-accession-download.ga_0

    Problems:

    • Output with path /tmp/tmpquzmltfa/SRR044777__1f7bfb65-f951-4191-ad52-4786ba81c88c different than expected, difference (using contains):
      ( /home/runner/work/iwc/iwc/workflows/data-fetching/parallel-accession-download/test-data/SRR044777_head.fastq v. /tmp/tmpknh59paySRR044777_head.fastq )
      Failed to find 'b'@F47USSH02H1LGA/4'' in history data. (lines_diff=0).
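
      A rough sketch of the "contains" comparison that failed here, as a simplification of the actual comparator (paths and the function name are illustrative): every line of the expected file must be found in the produced output, and with lines_diff=0 a single missing line fails the test.

        # Simplified 'contains' check; not Galaxy's real implementation.
        def contains(expected_path: str, actual_path: str) -> bool:
            with open(actual_path, "rb") as handle:
                actual_lines = set(handle.read().splitlines())
            with open(expected_path, "rb") as handle:
                for line in handle.read().splitlines():
                    if line not in actual_lines:
                        # here b'@F47USSH02H1LGA/4' was expected but never produced
                        return False
            return True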
      

    Workflow invocation details

    • Invocation Messages

    • Steps
      • Step 1: Run accessions:

        • step_state: scheduled
      • Step 2: Split accessions to collection:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • mkdir ./out && python '/tmp/shed_dir/toolshed.g2.bx.psu.edu/repos/bgruening/split_file_to_collection/baabc30154cd/split_file_to_collection/split_file_to_collection.py' --out ./out --in '/tmp/tmp8_14wfun/files/c/b/f/dataset_cbfa874e-8e4e-4102-b65f-e29063a8faae.dat' --ftype 'txt' --chunksize 1 --file_names 'split_file' --file_ext 'txt'

            Exit Code:

            • 0

            Job Parameters:

            • Job parameter | Parameter value
              __input_ext "txt"
              __workflow_invocation_uuid__ "52942787b0a211ee9f24237fad8a8bfc"
              chromInfo "/tmp/tmp8_14wfun/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              split_parms {"__current_case__": 5, "input": {"values": [{"id": 1, "src": "hda"}]}, "newfilenames": "split_file", "select_allocate": {"__current_case__": 2, "allocate": "byrow"}, "select_ftype": "txt", "select_mode": {"__current_case__": 0, "chunksize": "1", "mode": "chunk"}}
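
        For orientation, a minimal sketch of what this chunked split amounts to, given --chunksize 1 with byrow allocation (one input line per split_file_N dataset). This mirrors the parameters above, not the tool's actual code:

          import os

          def split_by_row(in_path: str, out_dir: str = "out", chunksize: int = 1) -> None:
              # Write every `chunksize` lines of the input to its own split_file_N.txt.
              os.makedirs(out_dir, exist_ok=True)
              with open(in_path) as handle:
                  lines = handle.readlines()
              for n in range(0, len(lines), chunksize):
                  out_name = os.path.join(out_dir, f"split_file_{n // chunksize}.txt")
                  with open(out_name, "w") as out:
                      out.writelines(lines[n:n + chunksize])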
      • Step 3: to parameter:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • cd ../; python _evaluate_expression_.py

            Exit Code:

            • 0

            Job Parameters:

            • Job parameter | Parameter value
              __input_ext "txt"
              __workflow_invocation_uuid__ "52942787b0a211ee9f24237fad8a8bfc"
              chromInfo "/tmp/tmp8_14wfun/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              param_type "text"
              remove_newlines true
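
        Given param_type "text" and remove_newlines true, this expression step boils down to reading the one-accession dataset and stripping newlines so the value can flow into fasterq-dump as a plain text parameter. A sketch (not the _evaluate_expression_.py internals):

          def dataset_to_text_param(path: str) -> str:
              # Read the single-accession dataset and drop newlines (remove_newlines=true).
              with open(path) as handle:
                  return handle.read().replace("\n", "")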
      • Step 4: fasterq-dump:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • set -o | grep -q pipefail && set -o pipefail;
              mkdir -p ~/.ncbi && cp '/tmp/tmp8_14wfun/job_working_directory/000/4/configs/tmpbfj_9r3d' ~/.ncbi/user-settings.mkfg &&
              export SRA_PREFETCH_RETRIES=3 && export SRA_PREFETCH_ATTEMPT=1 &&
              echo 'SRR044777' | sed -r 's/(\,|\;|__cn__)/\n/g' > accessions &&
              for acc in $(cat ./accessions); do (
                echo "Downloading accession: $acc..." &&
                while [ $SRA_PREFETCH_ATTEMPT -le $SRA_PREFETCH_RETRIES ] ; do
                  fasterq-dump "$acc" -e ${GALAXY_SLOTS:-1} --seq-defline '@$sn/$ri' --qual-defline '+' --split-3 --skip-technical 2>&1 | tee -a '/tmp/tmp8_14wfun/job_working_directory/000/4/outputs/dataset_3f7a3a89-9f7f-4d76-a80c-79709e5e14e6.dat';
                  if [ $? == 0 ] && [ $(ls *.fastq | wc -l) -ge 1 ]; then break ;
                  else echo "Prefetch attempt $SRA_PREFETCH_ATTEMPT of $SRA_PREFETCH_RETRIES exited with code $?" ; SRA_PREFETCH_ATTEMPT=`expr $SRA_PREFETCH_ATTEMPT + 1` ; sleep 1 ; fi ;
                done &&
                mkdir -p output && mkdir -p outputOther &&
                count="$(ls *.fastq | wc -l)" && echo "There are $count fastq files" && data=($(ls *.fastq)) &&
                if [ "$count" -eq 1 ]; then
                  pigz -cqp ${GALAXY_SLOTS:-1} "${data[0]}" > output/"${acc}"__single.fastqsanger.gz && rm "${data[0]}";
                elif [ "--split-3" = "--split-3" ]; then
                  if [ -e "${acc}".fastq ]; then pigz -cqp ${GALAXY_SLOTS:-1} "${acc}".fastq > outputOther/"${acc}"__single.fastqsanger.gz; fi &&
                  pigz -cqp ${GALAXY_SLOTS:-1} "${acc}"_1.fastq > output/"${acc}"_forward.fastqsanger.gz &&
                  pigz -cqp ${GALAXY_SLOTS:-1} "${acc}"_2.fastq > output/"${acc}"_reverse.fastqsanger.gz &&
                  rm "${acc}"*.fastq;
                elif [ "$count" -eq 2 ]; then
                  pigz -cqp ${GALAXY_SLOTS:-1} "${data[0]}" > output/"${acc}"_forward.fastqsanger.gz &&
                  pigz -cqp ${GALAXY_SLOTS:-1} "${data[1]}" > output/"${acc}"_reverse.fastqsanger.gz &&
                  rm "${data[0]}" && rm "${data[1]}";
                else
                  for file in ${data[*]}; do pigz -cqp ${GALAXY_SLOTS:-1} "$file" > outputOther/"$file"sanger.gz && rm "$file"; done;
                fi;
              ); done;
              echo "Done with all accessions."

            Exit Code:

            • 0

            Standard Output:

            • Downloading accession: SRR044777...
              spots read      : 7,882
              reads read      : 31,528
              reads written   : 7,882
              technical reads : 23,646
              There are 1 fastq files
              Done with all accessions.
              

            Job Parameters:

            • Job parameter | Parameter value
              __input_ext "input"
              __job_resource {"__current_case__": 0, "__job_resource__select": "no"}
              __workflow_invocation_uuid__ "52942787b0a211ee9f24237fad8a8bfc"
              adv {"minlen": null, "seq_defline": "@$sn/$ri", "skip_technical": true, "split": "--split-3"}
              chromInfo "/tmp/tmp8_14wfun/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              input {"__current_case__": 0, "accession": "SRR044777", "input_select": "accession_number"}
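
        Distilled from the command line above, the download step is a bounded retry loop: run fasterq-dump, and stop as soon as it exits cleanly and at least one FASTQ file exists. A loose Python rendering (flags copied from the command above; assumes sra-tools is installed):

          import glob
          import subprocess
          import time

          def download_with_retries(acc: str, retries: int = 3) -> bool:
              for attempt in range(1, retries + 1):
                  result = subprocess.run(
                      ["fasterq-dump", acc, "--split-3", "--skip-technical"])
                  if result.returncode == 0 and glob.glob("*.fastq"):
                      return True  # at least one FASTQ landed; stop retrying
                  print(f"Prefetch attempt {attempt} of {retries} "
                        f"exited with code {result.returncode}")
                  time.sleep(1)
              return False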
      • Step 5: flatten paired output:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Job Parameters:

            • Job parameter | Parameter value
              __workflow_invocation_uuid__ "52942787b0a211ee9f24237fad8a8bfc"
              input {"values": [{"id": 3, "src": "hdca"}]}
              rules {"mapping": [{"collapsible_value": {"__class__": "RuntimeValue"}, "columns": [1], "connectable": true, "editing": false, "is_workflow": false, "type": "list_identifiers"}, {"collapsible_value": {"__class__": "RuntimeValue"}, "columns": [2], "connectable": true, "is_workflow": false, "type": "paired_identifier"}], "rules": [{"collapsible_value": {"__class__": "RuntimeValue"}, "connectable": true, "error": null, "is_workflow": false, "type": "add_column_metadata", "value": "identifier0", "warn": null}, {"collapsible_value": {"__class__": "RuntimeValue"}, "connectable": true, "error": null, "is_workflow": false, "type": "add_column_metadata", "value": "identifier1", "warn": null}, {"collapsible_value": {"__class__": "RuntimeValue"}, "connectable": true, "error": null, "is_workflow": false, "type": "add_column_metadata", "value": "identifier2", "warn": null}]}
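
        Loosely, this rule set flattens the nested per-accession collection: column 1 (identifier0, the accession) becomes the list identifier and column 2 the paired identifier. A toy illustration of the reshaping, with a hypothetical accession and made-up dataset names:

          # Hypothetical nested collection: accession -> forward/reverse datasets.
          nested = {"SRRXXXXXXX": {"forward": "fwd.fastqsanger.gz",
                                   "reverse": "rev.fastqsanger.gz"}}
          # Column 1 -> list identifier, column 2 -> paired identifier.
          flat = {(acc, pair): dataset
                  for acc, pairs in nested.items()
                  for pair, dataset in pairs.items()}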
      • Step 6: flatten single end output:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Job Parameters:

            • Job parameter | Parameter value
              __workflow_invocation_uuid__ "52942787b0a211ee9f24237fad8a8bfc"
              input {"values": [{"id": 4, "src": "hdca"}]}
              rules {"mapping": [{"collapsible_value": {"__class__": "RuntimeValue"}, "columns": [1], "connectable": true, "editing": false, "is_workflow": false, "type": "list_identifiers"}], "rules": [{"collapsible_value": {"__class__": "RuntimeValue"}, "connectable": true, "error": null, "is_workflow": false, "type": "add_column_metadata", "value": "identifier0", "warn": null}, {"collapsible_value": {"__class__": "RuntimeValue"}, "connectable": true, "error": null, "is_workflow": false, "type": "add_column_metadata", "value": "identifier1", "warn": null}]}
    • Other invocation details
      • history_id

        • 1239dd3e295ce541
      • history_state

        • ok
      • invocation_id

        • 1239dd3e295ce541
      • invocation_state

        • scheduled
      • workflow_id

        • 642fb8db1d7b0e74
Passed Tests
  • ✅ parallel-accession-download.ga_1

    Workflow invocation details

    • Invocation Messages

    • Steps
      • Step 1: Run accessions:

        • step_state: scheduled
      • Step 2: Split accessions to collection:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • mkdir ./out && python '/tmp/shed_dir/toolshed.g2.bx.psu.edu/repos/bgruening/split_file_to_collection/baabc30154cd/split_file_to_collection/split_file_to_collection.py' --out ./out --in '/tmp/tmp8_14wfun/files/1/8/7/dataset_187b6885-da55-4529-9e61-0c05dd01acd6.dat' --ftype 'txt' --chunksize 1 --file_names 'split_file' --file_ext 'txt'

            Exit Code:

            • 0

            Job Parameters:

            • Job parameter | Parameter value
              __input_ext "txt"
              __workflow_invocation_uuid__ "7210c601b0a211ee9f24237fad8a8bfc"
              chromInfo "/tmp/tmp8_14wfun/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              split_parms {"__current_case__": 5, "input": {"values": [{"id": 7, "src": "hda"}]}, "newfilenames": "split_file", "select_allocate": {"__current_case__": 2, "allocate": "byrow"}, "select_ftype": "txt", "select_mode": {"__current_case__": 0, "chunksize": "1", "mode": "chunk"}}
      • Step 3: to parameter:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • cd ../; python _evaluate_expression_.py

            Exit Code:

            • 0

            Job Parameters:

            • Job parameter | Parameter value
              __input_ext "txt"
              __workflow_invocation_uuid__ "7210c601b0a211ee9f24237fad8a8bfc"
              chromInfo "/tmp/tmp8_14wfun/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              param_type "text"
              remove_newlines true
      • Step 4: fasterq-dump:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • set -o | grep -q pipefail && set -o pipefail;
              mkdir -p ~/.ncbi && cp '/tmp/tmp8_14wfun/job_working_directory/000/10/configs/tmpqjn48nuw' ~/.ncbi/user-settings.mkfg &&
              export SRA_PREFETCH_RETRIES=3 && export SRA_PREFETCH_ATTEMPT=1 &&
              echo 'SRR11953971' | sed -r 's/(\,|\;|__cn__)/\n/g' > accessions &&
              for acc in $(cat ./accessions); do (
                echo "Downloading accession: $acc..." &&
                while [ $SRA_PREFETCH_ATTEMPT -le $SRA_PREFETCH_RETRIES ] ; do
                  fasterq-dump "$acc" -e ${GALAXY_SLOTS:-1} --seq-defline '@$sn/$ri' --qual-defline '+' --split-3 --skip-technical 2>&1 | tee -a '/tmp/tmp8_14wfun/job_working_directory/000/10/outputs/dataset_110f93be-8ebd-4ec1-ac74-359df98c9209.dat';
                  if [ $? == 0 ] && [ $(ls *.fastq | wc -l) -ge 1 ]; then break ;
                  else echo "Prefetch attempt $SRA_PREFETCH_ATTEMPT of $SRA_PREFETCH_RETRIES exited with code $?" ; SRA_PREFETCH_ATTEMPT=`expr $SRA_PREFETCH_ATTEMPT + 1` ; sleep 1 ; fi ;
                done &&
                mkdir -p output && mkdir -p outputOther &&
                count="$(ls *.fastq | wc -l)" && echo "There are $count fastq files" && data=($(ls *.fastq)) &&
                if [ "$count" -eq 1 ]; then
                  pigz -cqp ${GALAXY_SLOTS:-1} "${data[0]}" > output/"${acc}"__single.fastqsanger.gz && rm "${data[0]}";
                elif [ "--split-3" = "--split-3" ]; then
                  if [ -e "${acc}".fastq ]; then pigz -cqp ${GALAXY_SLOTS:-1} "${acc}".fastq > outputOther/"${acc}"__single.fastqsanger.gz; fi &&
                  pigz -cqp ${GALAXY_SLOTS:-1} "${acc}"_1.fastq > output/"${acc}"_forward.fastqsanger.gz &&
                  pigz -cqp ${GALAXY_SLOTS:-1} "${acc}"_2.fastq > output/"${acc}"_reverse.fastqsanger.gz &&
                  rm "${acc}"*.fastq;
                elif [ "$count" -eq 2 ]; then
                  pigz -cqp ${GALAXY_SLOTS:-1} "${data[0]}" > output/"${acc}"_forward.fastqsanger.gz &&
                  pigz -cqp ${GALAXY_SLOTS:-1} "${data[1]}" > output/"${acc}"_reverse.fastqsanger.gz &&
                  rm "${data[0]}" && rm "${data[1]}";
                else
                  for file in ${data[*]}; do pigz -cqp ${GALAXY_SLOTS:-1} "$file" > outputOther/"$file"sanger.gz && rm "$file"; done;
                fi;
              ); done;
              echo "Done with all accessions."

            Exit Code:

            • 0

            Standard Output:

            • Downloading accession: SRR11953971...
              spots read      : 2,057
              reads read      : 4,114
              reads written   : 4,114
              There are 2 fastq files
              Done with all accessions.
              

            Job Parameters:

            • Job parameter | Parameter value
              __input_ext "input"
              __job_resource {"__current_case__": 0, "__job_resource__select": "no"}
              __workflow_invocation_uuid__ "7210c601b0a211ee9f24237fad8a8bfc"
              adv {"minlen": null, "seq_defline": "@$sn/$ri", "skip_technical": true, "split": "--split-3"}
              chromInfo "/tmp/tmp8_14wfun/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              input {"__current_case__": 0, "accession": "SRR11953971", "input_select": "accession_number"}
      • Step 5: flatten paired output:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Job Parameters:

            • Job parameter | Parameter value
              __workflow_invocation_uuid__ "7210c601b0a211ee9f24237fad8a8bfc"
              input {"values": [{"id": 11, "src": "hdca"}]}
              rules {"mapping": [{"collapsible_value": {"__class__": "RuntimeValue"}, "columns": [1], "connectable": true, "editing": false, "is_workflow": false, "type": "list_identifiers"}, {"collapsible_value": {"__class__": "RuntimeValue"}, "columns": [2], "connectable": true, "is_workflow": false, "type": "paired_identifier"}], "rules": [{"collapsible_value": {"__class__": "RuntimeValue"}, "connectable": true, "error": null, "is_workflow": false, "type": "add_column_metadata", "value": "identifier0", "warn": null}, {"collapsible_value": {"__class__": "RuntimeValue"}, "connectable": true, "error": null, "is_workflow": false, "type": "add_column_metadata", "value": "identifier1", "warn": null}, {"collapsible_value": {"__class__": "RuntimeValue"}, "connectable": true, "error": null, "is_workflow": false, "type": "add_column_metadata", "value": "identifier2", "warn": null}]}
      • Step 6: flatten single end output:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Job Parameters:

            • Job parameter | Parameter value
              __workflow_invocation_uuid__ "7210c601b0a211ee9f24237fad8a8bfc"
              input {"values": [{"id": 12, "src": "hdca"}]}
              rules {"mapping": [{"collapsible_value": {"__class__": "RuntimeValue"}, "columns": [1], "connectable": true, "editing": false, "is_workflow": false, "type": "list_identifiers"}], "rules": [{"collapsible_value": {"__class__": "RuntimeValue"}, "connectable": true, "error": null, "is_workflow": false, "type": "add_column_metadata", "value": "identifier0", "warn": null}, {"collapsible_value": {"__class__": "RuntimeValue"}, "connectable": true, "error": null, "is_workflow": false, "type": "add_column_metadata", "value": "identifier1", "warn": null}]}
    • Other invocation details
      • history_id

        • 642fb8db1d7b0e74
      • history_state

        • ok
      • invocation_id

        • 642fb8db1d7b0e74
      • invocation_state

        • scheduled
      • workflow_id

        • 642fb8db1d7b0e74
