1.1.3 Patch release #426

Merged: 26 commits, merged on Jan 12, 2024

Commits (26):

39bad24  Bump version after release (jfy133, Oct 27, 2023)
ad8ff5b  Merge pull request #413 from nf-core/bump-version-postrelease (jfy133, Oct 27, 2023)
5eb7ef2  add csv syntax highlighting (mashehu, Nov 15, 2023)
a77d047  Add csv in syntax highlighting in other plages (jfy133, Nov 16, 2023)
dbc2478  Merge pull request #419 from nf-core/add-csv-syntax (mashehu, Nov 16, 2023)
20e7d64  chore: update krakenuniq/preloadedkrakenuniq module (Midnighter, Nov 22, 2023)
cc3bdea  docs: describe changes in krakenuniq output (Midnighter, Nov 22, 2023)
c8cb7c1  docs: make a changelog entry (Midnighter, Nov 22, 2023)
8f71053  fix: adjust publishing pattern to allow FASTA (Midnighter, Nov 23, 2023)
bcb0539  fix: adjust schema to correct FASTA output (Midnighter, Nov 23, 2023)
aab50a5  Merge pull request #421 from nf-core/fix-krakenuniq (Midnighter, Nov 23, 2023)
0fa1d5f  Template update for nf-core/tools version 2.11 (nf-core-bot, Dec 19, 2023)
933f69c  Template update for nf-core/tools version 2.11.1 (nf-core-bot, Dec 20, 2023)
ac36299  Merge branch 'dev' into nf-core-template-merge-2.11.1 (sofstam, Jan 11, 2024)
1733450  Update multiqc module (sofstam, Jan 11, 2024)
65eeec0  Update adapterremoval (sofstam, Jan 11, 2024)
a3bf514  Update rest of modules failing to lint (sofstam, Jan 11, 2024)
5bdcd16  Merge pull request #424 from nf-core/nf-core-template-merge-2.11.1 (jfy133, Jan 12, 2024)
f813f9f  Fix linting (jfy133, Jan 12, 2024)
7fd8679  Bump version for release (jfy133, Jan 12, 2024)
fbb93ab  Update CHANGELOG.md (jfy133, Jan 12, 2024)
bc966fa  Update CHANGELOG.md (jfy133, Jan 12, 2024)
7055f0d  Add missing year (jfy133, Jan 12, 2024)
9f0445b  Merge branch 'patch-release-dump' of github.com:nf-core/taxprofiler i… (jfy133, Jan 12, 2024)
8959ae7  Update CHANGELOG.md (jfy133, Jan 12, 2024)
2fc3c02  Merge pull request #427 from nf-core/patch-release-dump (jfy133, Jan 12, 2024)
3 changes: 3 additions & 0 deletions .github/CONTRIBUTING.md
@@ -27,6 +27,9 @@ If you're not used to this workflow with git, you can start with some [docs from

## Tests

You can optionally test your changes by running the pipeline locally. Then it is recommended to use the `debug` profile to
receive warnings about process selectors and other debug info. Example: `nextflow run . -profile debug,test,docker --outdir <OUTDIR>`.
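
For example, a local test run of this branch might look like the following sketch, with `results` standing in for `<OUTDIR>`:

```bash
# Run the bundled test data with the debug profile enabled;
# "results" is a placeholder output directory.
nextflow run . -profile debug,test,docker --outdir results
```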

When you create a pull request with changes, [GitHub Actions](https://github.com/features/actions) will run automatic tests.
Typically, pull-requests are only fully reviewed when these tests are passing, though of course we can help out before then.

1 change: 1 addition & 0 deletions .github/PULL_REQUEST_TEMPLATE.md
@@ -19,6 +19,7 @@ Learn more about contributing: [CONTRIBUTING.md](https://github.com/nf-core/taxp
- [ ] If necessary, also make a PR on the nf-core/taxprofiler _branch_ on the [nf-core/test-datasets](https://github.com/nf-core/test-datasets) repository.
- [ ] Make sure your code lints (`nf-core lint`).
- [ ] Ensure the test suite passes (`nextflow run . -profile test,docker --outdir <OUTDIR>`).
- [ ] Check for unexpected warnings in debug mode (`nextflow run . -profile debug,test,docker --outdir <OUTDIR>`).
- [ ] Usage Documentation in `docs/usage.md` is updated.
- [ ] Output Documentation in `docs/output.md` is updated.
- [ ] `CHANGELOG.md` is updated.
2 changes: 1 addition & 1 deletion .github/workflows/ci.yml
@@ -42,7 +42,7 @@ jobs:

steps:
- name: Check out pipeline code
uses: actions/checkout@v3
uses: actions/checkout@v4

- name: Install Nextflow
uses: nf-core/setup-nextflow@v1
4 changes: 2 additions & 2 deletions .github/workflows/fix-linting.yml
@@ -13,7 +13,7 @@ jobs:
runs-on: ubuntu-latest
steps:
# Use the @nf-core-bot token to check out so we can push later
- uses: actions/checkout@v3
- uses: actions/checkout@v4
with:
token: ${{ secrets.nf_core_bot_auth_token }}

@@ -24,7 +24,7 @@ jobs:
env:
GITHUB_TOKEN: ${{ secrets.nf_core_bot_auth_token }}

- uses: actions/setup-node@v3
- uses: actions/setup-node@v4

- name: Install Prettier
run: npm install -g prettier @prettier/plugin-php
12 changes: 6 additions & 6 deletions .github/workflows/linting.yml
@@ -14,9 +14,9 @@ jobs:
EditorConfig:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v4

- uses: actions/setup-node@v3
- uses: actions/setup-node@v4

- name: Install editorconfig-checker
run: npm install -g editorconfig-checker
@@ -27,9 +27,9 @@
Prettier:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v4

- uses: actions/setup-node@v3
- uses: actions/setup-node@v4

- name: Install Prettier
run: npm install -g prettier
@@ -40,7 +40,7 @@
PythonBlack:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v4

- name: Check code lints with Black
uses: psf/black@stable
@@ -71,7 +71,7 @@ jobs:
runs-on: ubuntu-latest
steps:
- name: Check out pipeline code
uses: actions/checkout@v3
uses: actions/checkout@v4

- name: Install Nextflow
uses: nf-core/setup-nextflow@v1
4 changes: 3 additions & 1 deletion .gitpod.yml
@@ -4,7 +4,9 @@ tasks:
command: |
pre-commit install --install-hooks
nextflow self-update

- name: unset JAVA_TOOL_OPTIONS
command: |
unset JAVA_TOOL_OPTIONS
vscode:
extensions: # based on nf-core.nf-core-extensionpack
- codezombiech.gitignore # Language support for .gitignore files
20 changes: 20 additions & 0 deletions CHANGELOG.md
@@ -3,6 +3,26 @@
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/)
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).

## v1.1.3 - Augmented Akita Patch [2024-01-12]

### `Added`

- [#424](https://github.com/nf-core/taxprofiler/pull/424) Updated to nf-core pipeline template v2.11.1 (added by @LilyAnderssonLee & @sofstam)

### `Fixed`

- [#419](https://github.com/nf-core/taxprofiler/pull/419) Added improved syntax highlighting for tables in documentation (fix by @mashehu)
- [#421](https://github.com/nf-core/taxprofiler/pull/421) Updated the krakenuniq/preloadedkrakenuniq module that contained a fix for saving the output reads (❤️ to @SannaAb for reporting, fix by @Midnighter)
- [#427](https://github.com/nf-core/taxprofiler/pull/427) Fixed preprint information in the recommended methods text (fix by @jfy133)

### `Dependencies`

| Tool | Previous version | New version |
| ------------- | ---------------- | ----------- |
| multiqc | 1.15 | 1.19 |
| fastqc | 0.11.9 | 0.12.1 |
| nf-validation | unpinned | 1.1.3 |

## v1.1.2 - Augmented Akita Patch [2023-10-27]

### `Added`
15 changes: 5 additions & 10 deletions README.md
@@ -47,11 +47,8 @@

## Usage

:::note
If you are new to Nextflow and nf-core, please refer to [this page](https://nf-co.re/docs/usage/installation) on how
to set-up Nextflow. Make sure to [test your setup](https://nf-co.re/docs/usage/introduction#how-to-run-a-pipeline)
with `-profile test` before running the workflow on actual data.
:::
> [!NOTE]
> If you are new to Nextflow and nf-core, please refer to [this page](https://nf-co.re/docs/usage/installation) on how to set-up Nextflow. Make sure to [test your setup](https://nf-co.re/docs/usage/introduction#how-to-run-a-pipeline) with `-profile test` before running the workflow on actual data.

First, prepare a samplesheet with your input data that looks as follows:
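
As a minimal sketch only, reusing the column layout of the samplesheet examples in docs/usage.md further down (the sample name and file names are placeholders):

```csv title="samplesheet.csv"
sample,run_accession,instrument_platform,fastq_1,fastq_2,fasta
2611,run1,ILLUMINA,sample_R1.fastq.gz,sample_R2.fastq.gz,
```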

@@ -89,11 +86,9 @@ nextflow run nf-core/taxprofiler \
--run_kraken2 --run_metaphlan
```
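
A complete invocation, sketched here with placeholder file names (`samplesheet.csv`, `database.csv`, `results`) rather than values taken from this PR, might look like:

```bash
nextflow run nf-core/taxprofiler \
    -profile docker \
    --input samplesheet.csv \
    --databases database.csv \
    --outdir results \
    --run_kraken2 --run_metaphlan
```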

:::warning
Please provide pipeline parameters via the CLI or Nextflow `-params-file` option. Custom config files including those
provided by the `-c` Nextflow option can be used to provide any configuration _**except for parameters**_;
see [docs](https://nf-co.re/usage/configuration#custom-configuration-files).
:::
> [!WARNING]
> Please provide pipeline parameters via the CLI or Nextflow `-params-file` option. Custom config files including those provided by the `-c` Nextflow option can be used to provide any configuration _**except for parameters**_;
> see [docs](https://nf-co.re/usage/configuration#custom-configuration-files).

For more details and further functionality, please refer to the [usage documentation](https://nf-co.re/taxprofiler/usage) and the [parameter documentation](https://nf-co.re/taxprofiler/parameters).

5 changes: 2 additions & 3 deletions assets/methods_description_template.yml
@@ -3,8 +3,6 @@ description: "Suggested text and references to use when describing pipeline usag
section_name: "nf-core/taxprofiler Methods Description"
section_href: "https://github.com/nf-core/taxprofiler"
plot_type: "html"
## TODO nf-core: Update the HTML below to your preferred methods description, e.g. add publication citation for this pipeline
## You inject any metadata in the Nextflow '${workflow}' object
data: |
<h4>Methods</h4>
<p>Data was processed using nf-core/taxprofiler v${workflow.manifest.version} ${doi_text} of the nf-core collection of workflows (<a href="https://doi.org/10.1038/s41587-020-0439-x">Ewels <em>et al.</em>, 2020</a>), utilising reproducible software environments from the Bioconda (<a href="https://doi.org/10.1038/s41592-018-0046-7">Grüning <em>et al.</em>, 2018</a>) and Biocontainers (<a href="https://doi.org/10.1093/bioinformatics/btx192">da Veiga Leprevost <em>et al.</em>, 2017</a>) projects.</p>
@@ -17,12 +15,13 @@ data: |
<li>Ewels, P. A., Peltzer, A., Fillinger, S., Patel, H., Alneberg, J., Wilm, A., Garcia, M. U., Di Tommaso, P., & Nahnsen, S. (2020). The nf-core framework for community-curated bioinformatics pipelines. Nature Biotechnology, 38(3), 276-278. doi: <a href="https://doi.org/10.1038/s41587-020-0439-x">10.1038/s41587-020-0439-x</a></li>
<li>Grüning, B., Dale, R., Sjödin, A., Chapman, B. A., Rowe, J., Tomkins-Tinch, C. H., Valieris, R., Köster, J., & Bioconda Team. (2018). Bioconda: sustainable and comprehensive software distribution for the life sciences. Nature Methods, 15(7), 475–476. doi: <a href="https://doi.org/10.1038/s41592-018-0046-7">10.1038/s41592-018-0046-7</a></li>
<li>da Veiga Leprevost, F., Grüning, B. A., Alves Aflitos, S., Röst, H. L., Uszkoreit, J., Barsnes, H., Vaudel, M., Moreno, P., Gatto, L., Weber, J., Bai, M., Jimenez, R. C., Sachsenberg, T., Pfeuffer, J., Vera Alvarez, R., Griss, J., Nesvizhskii, A. I., & Perez-Riverol, Y. (2017). BioContainers: an open-source and community-driven framework for software standardization. Bioinformatics (Oxford, England), 33(16), 2580–2582. doi: <a href="https://doi.org/10.1093/bioinformatics/btx192">10.1093/bioinformatics/btx192</a></li>
<li>Stamouli, S., Beber, M. E., Normark, T., Christensen, T. A., Andersson-Li, L., Borry, M., Jamy, M., nf-core community, & Fellows Yates, J. A. (2023). nf-core/taxprofiler: Highly parallelised and flexible pipeline for metagenomic taxonomic classification and profiling. (Preprint). bioRxiv 2023.10.20.563221. doi: <a href="https://doi.org/10.1101/2023.10.20.563221">10.1101/2023.10.20.563221</a></li>
${tool_bibliography}
</ul>
<div class="alert alert-info">
<h5>Notes:</h5>
<ul>
${nodoi_text}
${doi_text}
<li>The command above does not include parameters contained in any configs or profiles that may have been used. Ensure the config file is also uploaded with your publication!</li>
<li>You should also cite all software used within this run. Check the "Software Versions" of this report to get version information.</li>
</ul>
4 changes: 2 additions & 2 deletions assets/multiqc_config.yml
@@ -1,7 +1,7 @@
report_comment: >
This report has been generated by the <a href="https://github.com/nf-core/taxprofiler/releases/tag/1.1.2" target="_blank">nf-core/taxprofiler</a>
This report has been generated by the <a href="https://github.com/nf-core/taxprofiler/releases/tag/1.1.3" target="_blank">nf-core/taxprofiler</a>
analysis pipeline. For information about how to interpret these results, please see the
<a href="https://nf-co.re/taxprofiler/1.1.2/docs/output" target="_blank">documentation</a>.
<a href="https://nf-co.re/taxprofiler/1.1.3/docs/output" target="_blank">documentation</a>.

report_section_order:
"nf-core-taxprofiler-methods-description":
2 changes: 1 addition & 1 deletion assets/slackreport.json
@@ -3,7 +3,7 @@
{
"fallback": "Plain-text summary of the attachment.",
"color": "<% if (success) { %>good<% } else { %>danger<%} %>",
"author_name": "nf-core/taxprofiler v${version} - ${runName}",
"author_name": "nf-core/taxprofiler ${version} - ${runName}",
"author_icon": "https://www.nextflow.io/docs/latest/_static/favicon.ico",
"text": "<% if (success) { %>Pipeline completed successfully!<% } else { %>Pipeline completed with errors<% } %>",
"fields": [
4 changes: 2 additions & 2 deletions conf/modules.config
@@ -500,7 +500,7 @@ process {
publishDir = [
path: { "${params.outdir}/krakenuniq/${meta.db_name}/" },
mode: params.publish_dir_mode,
pattern: '*.{txt,fastq.gz}'
pattern: '*.{txt,fasta.gz}'
]
}

@@ -769,7 +769,7 @@ process {
}

withName: 'MULTIQC' {
ext.args = params.multiqc_title ? "--title \"$params.multiqc_title\"" : ''
ext.args = { params.multiqc_title ? "--title \"$params.multiqc_title\"" : '' }
publishDir = [
path: { "${params.outdir}/multiqc" },
mode: params.publish_dir_mode,
14 changes: 7 additions & 7 deletions docs/output.md
@@ -375,23 +375,23 @@ You will only receive the `.fastq` and `*classifiedreads.txt` file if you supply

### KrakenUniq

[KrakenUniq](https://github.com/fbreitwieser/krakenuniq) (formerly KrakenHLL) is an extenson to the fast k-mer-based classification [Kraken](https://github.com/DerrickWood/kraken) with an efficient algorithm for additionally assessing the coverage of unique k-mers found in each species in a dataset.
[KrakenUniq](https://github.com/fbreitwieser/krakenuniq) (formerly KrakenHLL) is an extension to the fast k-mer-based classification performed by [Kraken](https://github.com/DerrickWood/kraken) with an efficient algorithm for additionally assessing the coverage of unique k-mers found in each species in a dataset.

<details markdown="1">
<summary>Output files</summary>

- `krakenuniq/`
- `<db_name>/`
- `<sample_id>_<db_name>.classified.fastq.gz`: FASTQ file containing all reads that had a hit against a reference in the database for a given sample
- `<sample_id>_<db_name>.unclassified.fastq.gz`: FASTQ file containing all reads that did not have a hit in the database for a given sample
- `<sample_id>_<db_name>.report.txt`: A Kraken2-style report that summarises the fraction abundance, taxonomic ID, number of Kmers, taxonomic path of all the hits, with an additional column for k-mer coverage, that allows for more accurate distinguishing between false-positive/true-postitive hits
- `<sample_id>_<db_name>.classifiedreads.txt`: A list of read IDs and the hits each read had against each database for a given sample
- `<sample_id>_<db_name>[.merged].classified.fasta.gz`: Optional FASTA file containing all reads that had a hit against a reference in the database for a given sample. Paired-end input reads are merged in this output.
- `<sample_id>_<db_name>[.merged].unclassified.fasta.gz`: Optional FASTA file containing all reads that did not have a hit in the database for a given sample. Paired-end input reads are merged in this output.
- `<sample_id>_<db_name>.krakenuniq.report.txt`: A Kraken2-style report that summarises the fraction abundance, taxonomic ID, number of Kmers, taxonomic path of all the hits, with an additional column for k-mer coverage, that allows for more accurate distinguishing between false-positive/true-postitive hits.
- `<sample_id>_<db_name>.krakenuniq.classified.txt`: An optional list of read IDs and the hits each read had against each database for a given sample.

</details>

The main taxonomic classification file from KrakenUniq is the `*report.txt` file. This is an extension of the Kraken2 report with the additional k-mer coverage information that provides more information about the accuracy of hits.
The main taxonomic classification file from KrakenUniq is the `*.krakenuniq.report.txt` file. This is an extension of the Kraken2 report with the additional k-mer coverage information that provides more information about the accuracy of hits.

You will only receive the `.fastq` and `*classifiedreads.txt` file if you supply `--krakenuniq_save_reads` and/or `--krakenuniq_save_readclassification` parameters to the pipeline.
You will only receive the `.fasta.gz` and `*.krakenuniq.classified.txt` file if you supply `--krakenuniq_save_reads` and/or `--krakenuniq_save_readclassification` parameters to the pipeline.
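
For example, a run that opts in to both outputs could be sketched as follows; the profiler switch, sheet names, and output directory are illustrative assumptions:

```bash
# Enable KrakenUniq and keep its optional read-level outputs.
nextflow run nf-core/taxprofiler -profile docker --input samplesheet.csv --databases database.csv \
    --outdir results --run_krakenuniq --krakenuniq_save_reads --krakenuniq_save_readclassification
```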

:::info
The output system of KrakenUniq can result in other `stdout` or `stderr` logging information being saved in the report file, therefore you must check your report files before downstream use!
7 changes: 3 additions & 4 deletions docs/usage.md
@@ -44,12 +44,11 @@ This samplesheet is then specified on the command line as follows:

The `sample` identifiers have to be the same when you have re-sequenced the same sample more than once e.g. to increase sequencing depth. The pipeline will concatenate different runs FASTQ files of the same sample before performing profiling, when `--perform_runmerging` is supplied. Below is an example for the same sample sequenced across 3 lanes:

```console
```csv title="samplesheet.csv"
sample,run_accession,instrument_platform,fastq_1,fastq_2,fasta
2612,run1,ILLUMINA,2612_run1_R1.fq.gz,,
2612,run2,ILLUMINA,2612_run2_R1.fq.gz,,
2612,run3,ILLUMINA,2612_run3_R1.fq.gz,2612_run3_R2.fq.gz,

```

:::warning
@@ -62,7 +61,7 @@ The pipeline will auto-detect whether a sample is single- or paired-end using th

A final samplesheet file consisting of both single- and paired-end data, as well as long-read FASTA files may look something like the one below. This is for 6 samples, where `2612` has been sequenced twice.

```console
```csv title="samplesheet.csv"
sample,run_accession,instrument_platform,fastq_1,fastq_2,fasta
2611,ERR5766174,ILLUMINA,,,/<path>/<to>/fasta/ERX5474930_ERR5766174_1.fa.gz
2612,ERR5766176,ILLUMINA,/<path>/<to>/fastq/ERX5474932_ERR5766176_1.fastq.gz,/<path>/<to>/fastq/ERX5474932_ERR5766176_2.fastq.gz,
@@ -110,7 +109,7 @@ An example database sheet can look as follows, where 7 tools are being used, and

`kraken2` will be run twice even though only having a single 'dedicated' database because specifying `bracken` implies first running `kraken2` on the `bracken` database, as required by `bracken`.

```console
```csv
tool,db_name,db_params,db_path
malt,malt85,-id 85,/<path>/<to>/malt/testdb-malt/
malt,malt95,-id 90,/<path>/<to>/malt/testdb-malt.tar.gz
32 changes: 18 additions & 14 deletions lib/NfcoreTemplate.groovy
@@ -4,6 +4,7 @@

import org.yaml.snakeyaml.Yaml
import groovy.json.JsonOutput
import nextflow.extension.FilesEx

class NfcoreTemplate {

@@ -141,12 +142,14 @@ class NfcoreTemplate {
try {
if (params.plaintext_email) { throw GroovyException('Send plaintext e-mail, not HTML') }
// Try to send HTML e-mail using sendmail
def sendmail_tf = new File(workflow.launchDir.toString(), ".sendmail_tmp.html")
sendmail_tf.withWriter { w -> w << sendmail_html }
[ 'sendmail', '-t' ].execute() << sendmail_html
log.info "-${colors.purple}[$workflow.manifest.name]${colors.green} Sent summary e-mail to $email_address (sendmail)-"
} catch (all) {
// Catch failures and try with plaintext
def mail_cmd = [ 'mail', '-s', subject, '--content-type=text/html', email_address ]
if ( mqc_report.size() <= max_multiqc_email_size.toBytes() ) {
if ( mqc_report != null && mqc_report.size() <= max_multiqc_email_size.toBytes() ) {
mail_cmd += [ '-A', mqc_report ]
}
mail_cmd.execute() << email_html
Expand All @@ -155,14 +158,16 @@ class NfcoreTemplate {
}

// Write summary e-mail HTML to a file
def output_d = new File("${params.outdir}/pipeline_info/")
if (!output_d.exists()) {
output_d.mkdirs()
}
def output_hf = new File(output_d, "pipeline_report.html")
def output_hf = new File(workflow.launchDir.toString(), ".pipeline_report.html")
output_hf.withWriter { w -> w << email_html }
def output_tf = new File(output_d, "pipeline_report.txt")
FilesEx.copyTo(output_hf.toPath(), "${params.outdir}/pipeline_info/pipeline_report.html");
output_hf.delete()

// Write summary e-mail TXT to a file
def output_tf = new File(workflow.launchDir.toString(), ".pipeline_report.txt")
output_tf.withWriter { w -> w << email_txt }
FilesEx.copyTo(output_tf.toPath(), "${params.outdir}/pipeline_info/pipeline_report.txt");
output_tf.delete()
}

//
@@ -227,15 +232,14 @@ class NfcoreTemplate {
// Dump pipeline parameters in a json file
//
public static void dump_parameters(workflow, params) {
def output_d = new File("${params.outdir}/pipeline_info/")
if (!output_d.exists()) {
output_d.mkdirs()
}

def timestamp = new java.util.Date().format( 'yyyy-MM-dd_HH-mm-ss')
def output_pf = new File(output_d, "params_${timestamp}.json")
def filename = "params_${timestamp}.json"
def temp_pf = new File(workflow.launchDir.toString(), ".${filename}")
def jsonStr = JsonOutput.toJson(params)
output_pf.text = JsonOutput.prettyPrint(jsonStr)
temp_pf.text = JsonOutput.prettyPrint(jsonStr)

FilesEx.copyTo(temp_pf.toPath(), "${params.outdir}/pipeline_info/params_${timestamp}.json")
temp_pf.delete()
}

//