
[REVIEW]: EUKulele: Taxonomic annotation of the unsung eukaryotic microbes #2817

Closed
40 tasks done
whedon opened this issue Nov 4, 2020 · 68 comments
Labels: accepted, published (Papers published in JOSS), recommend-accept (Papers recommended for acceptance in JOSS), review, Batchfile, Python, TeX

@whedon
whedon commented Nov 4, 2020

Submitting author: @akrinos (Arianna Krinos)
Repository: https://github.com/AlexanderLabWHOI/EUKulele
Version: v1.0.2b
Editor: @will-rowe
Reviewer: @johanneswerner, @jcmcnch
Archive: 10.5281/zenodo.4422091

⚠️ JOSS reduced service mode ⚠️

Due to the challenges of the COVID-19 pandemic, JOSS is currently operating in a "reduced service mode". You can read more about what that means in our blog post.

Status

status

Status badge code:

HTML: <a href="https://joss.theoj.org/papers/b6b7999944beedba3e3a4d391fd3180c"><img src="https://joss.theoj.org/papers/b6b7999944beedba3e3a4d391fd3180c/status.svg"></a>
Markdown: [![status](https://joss.theoj.org/papers/b6b7999944beedba3e3a4d391fd3180c/status.svg)](https://joss.theoj.org/papers/b6b7999944beedba3e3a4d391fd3180c)

Reviewers and authors:

Please avoid lengthy details of difficulties in the review thread. Instead, please create a new issue in the target repository and link to those issues (especially acceptance-blockers) by leaving comments in the review thread below. (For completists: if the target issue tracker is also on GitHub, linking the review thread in the issue or vice versa will create corresponding breadcrumb trails in the link target.)

Reviewer instructions & questions

@johanneswerner & @jcmcnch, please carry out your review in this issue by updating the checklist below. If you cannot edit the checklist please:

  1. Make sure you're logged in to your GitHub account
  2. Be sure to accept the invite at this URL: https://github.com/openjournals/joss-reviews/invitations

The reviewer guidelines are available here: https://joss.readthedocs.io/en/latest/reviewer_guidelines.html. Any questions/concerns please let @will-rowe know.

Please start on your review when you are able, and be sure to complete your review within the next six weeks at the very latest.

Review checklist for @johanneswerner

Conflict of interest

  • I confirm that I have read the JOSS conflict of interest (COI) policy and that: I have no COIs with reviewing this work or that any perceived COIs have been waived by JOSS for the purpose of this review.

Code of Conduct

General checks

  • Repository: Is the source code for this software available at the repository url?
  • License: Does the repository contain a plain-text LICENSE file with the contents of an OSI approved software license?
  • Contribution and authorship: Has the submitting author (@akrinos) made major contributions to the software? Does the full list of paper authors seem appropriate and complete?
  • Substantial scholarly effort: Does this submission meet the scope eligibility described in the JOSS guidelines?

Functionality

  • Installation: Does installation proceed as outlined in the documentation?
  • Functionality: Have the functional claims of the software been confirmed?
  • Performance: If there are any performance claims of the software, have they been confirmed? (If there are no claims, please check off this item.)

Documentation

  • A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • Installation instructions: Is there a clearly-stated list of dependencies? Ideally these should be handled with an automated package management solution.
  • Example usage: Do the authors include examples of how to use the software (ideally to solve real-world analysis problems)?
  • Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g., API method documentation)?
  • Automated tests: Are there automated tests or manual steps described so that the functionality of the software can be verified?
  • Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the software 2) Report issues or problems with the software 3) Seek support

Software paper

  • Summary: Has a clear description of the high-level functionality and purpose of the software for a diverse, non-specialist audience been provided?
  • A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • State of the field: Do the authors describe how this software compares to other commonly-used packages?
  • Quality of writing: Is the paper well written (i.e., it does not require editing for structure, language, or writing quality)?
  • References: Is the list of references complete, and is everything cited appropriately that should be cited (e.g., papers, datasets, software)? Do references in the text use the proper citation syntax?

Review checklist for @jcmcnch

Conflict of interest

  • I confirm that I have read the JOSS conflict of interest (COI) policy and that: I have no COIs with reviewing this work or that any perceived COIs have been waived by JOSS for the purpose of this review.

Code of Conduct

General checks

  • Repository: Is the source code for this software available at the repository url?
  • License: Does the repository contain a plain-text LICENSE file with the contents of an OSI approved software license?
  • Contribution and authorship: Has the submitting author (@akrinos) made major contributions to the software? Does the full list of paper authors seem appropriate and complete?
  • Substantial scholarly effort: Does this submission meet the scope eligibility described in the JOSS guidelines?

Functionality

  • Installation: Does installation proceed as outlined in the documentation?
  • Functionality: Have the functional claims of the software been confirmed?
  • Performance: If there are any performance claims of the software, have they been confirmed? (If there are no claims, please check off this item.)

Documentation

  • A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • Installation instructions: Is there a clearly-stated list of dependencies? Ideally these should be handled with an automated package management solution.
  • Example usage: Do the authors include examples of how to use the software (ideally to solve real-world analysis problems)?
  • Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g., API method documentation)?
  • Automated tests: Are there automated tests or manual steps described so that the functionality of the software can be verified?
  • Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the software 2) Report issues or problems with the software 3) Seek support

Software paper

  • Summary: Has a clear description of the high-level functionality and purpose of the software for a diverse, non-specialist audience been provided?
  • A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • State of the field: Do the authors describe how this software compares to other commonly-used packages?
  • Quality of writing: Is the paper well written (i.e., it does not require editing for structure, language, or writing quality)?
  • References: Is the list of references complete, and is everything cited appropriately that should be cited (e.g., papers, datasets, software)? Do references in the text use the proper citation syntax?
@whedon
Author

whedon commented Nov 4, 2020

Hello human, I'm @whedon, a robot that can help you with some common editorial tasks. @johanneswerner, @jcmcnch it looks like you're currently assigned to review this paper 🎉.

⚠️ JOSS reduced service mode ⚠️

Due to the challenges of the COVID-19 pandemic, JOSS is currently operating in a "reduced service mode". You can read more about what that means in our blog post.

⭐ Important ⭐

If you haven't already, you should seriously consider unsubscribing from GitHub notifications for this (https://github.com/openjournals/joss-reviews) repository. As a reviewer, you're probably currently watching this repository, which means that, under GitHub's default behaviour, you will receive notifications (emails) for all reviews 😿

To fix this do the following two things:

  1. Set yourself as 'Not watching' https://github.com/openjournals/joss-reviews:

watching

  2. You may also like to change your default notification settings for watched repositories in your GitHub profile here: https://github.com/settings/notifications

notifications

For a list of things I can do to help you, just type:

@whedon commands

For example, to regenerate the paper pdf after making changes in the paper's md or bib files, type:

@whedon generate pdf

@whedon
Author

whedon commented Nov 4, 2020

👉📄 Download article proof 📄 View article proof on GitHub 📄 👈

@whedon
Author

whedon commented Nov 4, 2020

Reference check summary (note 'MISSING' DOIs are suggestions that need verification):

OK DOIs

- 10.7287/peerj.preprints.27295v1 is OK
- 10.1038/s41564-018-0176-9 is OK
- 10.1038/s41467-017-02342-1 is OK
- 10.1101/2020.06.30.180687 is OK
- 10.5281/zenodo.1476236 is OK
- 10.1051/0004-6361/201629272 is OK
- 10.1051/0004-6361/201322068 is OK

MISSING DOIs

- 10.1142/s0219720012500151 may be a valid DOI for title: Metagenomic taxonomic classification using extreme learning machines
- 10.1038/ncomms11257 may be a valid DOI for title: Fast and sensitive taxonomic classification for metagenomics with Kaiju
- 10.1111/1755-0998.13147 may be a valid DOI for title: A metagenomic assessment of microbial eukaryotic diversity in the global ocean
- 10.1038/ismej.2015.30 may be a valid DOI for title: Metatranscriptomic census of active protists in soils
- 10.1038/nrmicro.2016.160 may be a valid DOI for title: Probing the evolution, ecology and physiology of marine protists using transcriptomics
- 10.1093/database/baaa051 may be a valid DOI for title: SAGER: a database of Symbiodiniaceae and Algal Genomic Resource
- 10.1111/jpy.12529 may be a valid DOI for title: Robust Dinoflagellata phylogeny inferred from public transcriptome databases
- 10.1016/j.tim.2018.10.009 may be a valid DOI for title: Are we overestimating protistan diversity in nature?
- 10.1093/nar/gks1160 may be a valid DOI for title: The Protist Ribosomal Reference database (PR2): a catalog of unicellular eukaryote small sub-unit rRNA sequences with curated taxonomy
- 10.1016/j.tree.2014.03.006 may be a valid DOI for title: The others: our biased perspective of eukaryotic genomes
- 10.1371/journal.pbio.2005849 may be a valid DOI for title: EukRef: Phylogenetic curation of ribosomal RNA to enhance understanding of eukaryotic diversity and distribution
- 10.1093/gigascience/giy158 may be a valid DOI for title: Re-assembly, quality evaluation, and annotation of 678 microbial eukaryotic reference transcriptomes
- 10.1007/978-3-319-61510-3_4 may be a valid DOI for title: Functional analysis in metagenomics using MEGAN 6
- 10.1007/978-1-4939-3369-3_13 may be a valid DOI for title: MG-RAST, a metagenomics service for analysis of microbial community structure and function
- 10.1016/j.gpb.2015.08.003 may be a valid DOI for title: The Tara Oceans project: new opportunities and greater challenges ahead
- 10.1038/sdata.2017.203 may be a valid DOI for title: The reconstruction of 2,631 draft metagenome-assembled genomes from the global oceans
- 10.1093/bioinformatics/btw445 may be a valid DOI for title: SWORD—a highly efficient protein database search
- 10.1038/nmeth.3176 may be a valid DOI for title: Fast and sensitive protein alignment using DIAMOND
- 10.1101/2020.06.30.180687 may be a valid DOI for title: EukProt: a database of genome-scale predicted proteins across the diversity of eukaryotic life
- 10.1371/journal.pone.0016342 may be a valid DOI for title: How and why DNA barcodes underestimate the diversity of microbial eukaryotes
- 10.1038/ncomms12860 may be a valid DOI for title: Adaptive radiation by waves of gene transfer leads to fine-scale resource partitioning in marine microbes
- 10.1111/gcb.12983 may be a valid DOI for title: Bridging the gap between omics and earth system science to better understand how environmental change impacts marine microbes
- 10.1098/rstb.2015.0331 may be a valid DOI for title: Censusing marine eukaryotic diversity in the twenty-first century
- 10.1007/978-3-030-38281-0_12 may be a valid DOI for title: Eukaryotic Pangenomes
- 10.1038/nature12221 may be a valid DOI for title: Pan genome of the phytoplankton Emiliania underpins its global distribution
- 10.1128/aem.01541-09 may be a valid DOI for title: Introducing mothur: open-source, platform-independent, community-supported software for describing and comparing microbial communities

INVALID DOIs

- None

@whedon
Author

whedon commented Nov 11, 2020

👋 @jcmcnch, please update us on how your review is going.

@whedon
Author

whedon commented Nov 11, 2020

👋 @johanneswerner, please update us on how your review is going.

@johanneswerner

johanneswerner commented Nov 17, 2020

Very interesting software package for the analysis of eukaryotes in metagenomes and metatranscriptomes. I like the focus of this tool, the well-written article, and the very comprehensive documentation, which includes thorough explanations and citations.

I have a few comments that might still be addressed.

  • installation

    • I don't know if this can be improved, but the installation via conda (as described here) takes a lot of time.
  • documentation

    • the links :ref:documentation and :ref:Parameters are not working in running-eukulele.rst
    • databaseandconfig.rst: there are four, not three, databases
  • minimal working example:

EUKulele --config curr_config.yaml 
 
Running EUKulele with entries from the provided configuration file.
No BUSCO file specified/found; using argument-specified organisms and taxonomy for BUSCO analysis.
Setting things up...
Found database folder for reference_DIR in current directory; will not re-download.
Creating a diamond reference from database files...
Aligning to reference database...
['samples_MAGs/sample_2.faa', 'samples_MAGs/sample_1.faa', 'samples_MAGs/sample_0.faa']
Aligning sample sample_2...
Aligning sample sample_1...
Aligning sample sample_0...
Diamond process exited for sample sample_2.
Diamond process exited for sample sample_1.
Diamond process exited for sample sample_0.
Performing taxonomic estimation steps...
Performing taxonomic visualization steps...
Performing taxonomic assignment steps...
Performing BUSCO steps...
Configuring BUSCO...
Running busco with 2 simultaneous jobs...
BUSCO error log:
Traceback (most recent call last):

  File "/home/ubuntu/miniconda3/envs/EUKulele/bin/busco_configurator.py", line 15, in <module>

    for line in open(sys.argv[1]):

FileNotFoundError: [Errno 2] No such file or directory: '/home/ubuntu/.local/bin/../config/config.ini'

sed: can't read test_out_23July/busco/config_sample_1.ini: No such file or directory

sed: can't read test_out_23July/busco/config_sample_1.ini: No such file or directory

sed: can't read test_out_23July/busco/config_sample_1.ini: No such file or directory

ERROR:  Config file test_out_23July/busco/config_sample_2.ini cannot be found

ERROR:  BUSCO analysis failed !

ERROR:  Check the logs, read the user guide, and check the BUSCO issue board on https://gitlab.com/ezlab/busco/issues

BUSCO output log:
python3 busco_configurator.py /home/ubuntu/.local/bin/../config/config.ini test_out_23July/busco/config_sample_1.ini

INFO:   ***** Start a BUSCO v4.1.4 analysis, current time: 11/17/2020 10:19:54 *****

INFO:   Configuring BUSCO with test_out_23July/busco/config_sample_2.ini

BUSCO error log:
ERROR:  Config file test_out_23July/busco/config_sample_1.ini cannot be found

ERROR:  BUSCO analysis failed !

ERROR:  Check the logs, read the user guide, and check the BUSCO issue board on https://gitlab.com/ezlab/busco/issues

BUSCO output log:
INFO:   ***** Start a BUSCO v4.1.4 analysis, current time: 11/17/2020 10:19:54 *****

INFO:   Configuring BUSCO with test_out_23July/busco/config_sample_1.ini

BUSCO error log:
Traceback (most recent call last):

  File "/home/ubuntu/miniconda3/envs/EUKulele/bin/busco_configurator.py", line 15, in <module>

    for line in open(sys.argv[1]):

FileNotFoundError: [Errno 2] No such file or directory: '/home/ubuntu/.local/bin/../config/config.ini'

sed: can't read test_out_23July/busco/config_sample_0.ini: No such file or directory

sed: can't read test_out_23July/busco/config_sample_0.ini: No such file or directory

sed: can't read test_out_23July/busco/config_sample_0.ini: No such file or directory

ERROR:  Config file test_out_23July/busco/config_sample_0.ini cannot be found

ERROR:  BUSCO analysis failed !

ERROR:  Check the logs, read the user guide, and check the BUSCO issue board on https://gitlab.com/ezlab/busco/issues

BUSCO output log:
python3 busco_configurator.py /home/ubuntu/.local/bin/../config/config.ini test_out_23July/busco/config_sample_0.ini

INFO:   ***** Start a BUSCO v4.1.4 analysis, current time: 11/17/2020 10:19:55 *****

INFO:   Configuring BUSCO with test_out_23July/busco/config_sample_0.ini

[] is what is in BUSCO directory
BUSCO initial run did not complete successfully.
Please check the BUSCO run log files in the log/ folder.
  • tests
    • pytest tests/ returns one failed test
______________________________________________________ ERROR collecting tests/setupanddownload/test_database.py ______________________________________________________
ImportError while importing test module '/data/EUKulele/tests/setupanddownload/test_database.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
/home/ubuntu/miniconda3/envs/EUKulele/lib/python3.6/importlib/__init__.py:126: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
tests/setupanddownload/test_database.py:8: in <module>
    from EUKuleleconfig import *
E   ModuleNotFoundError: No module named 'EUKuleleconfig'
  • code quality

    • running pylint (pylint $(git ls-files '*.py')) on the repository returns a score of 3.07. I would try to get the pylint score to >=8 (and most warnings are easy to fix).
  • minor comments

  • a few comments about the manuscript

    • How would someone deal with a sample that contains both prokaryotes and eukaryotes? Is it also possible to analyze prokaryotes or is this not desired (if it is possible, how)?
    • Figure 1: try to avoid arrows that are overlapping the text
    • The manuscript lists three databases, however in the documentation four databases are listed (EukZoo is missing in the manuscript, see also in databaseandconfig.rst)
    • benchmarking: how long does EUKulele run depending on the size of a dataset/number of sequences? can it be estimated how long a dataset of a certain size will run?
    • majority of the references do not have dois linked
    • state of the field: EUKulele is compared to the tools MEGAN and MG-RAST which are as far as I know mostly used for the analyses of prokaryotes (and I believe mostly for metagenomics) - maybe there are better tools to compare EUKulele with

@will-rowe

Great - thanks @johanneswerner!

Can you please respond to these comments when you get the chance @akrinos.

@jcmcnch - can you let us know how you are getting on please?

@akrinos

akrinos commented Nov 18, 2020

@johanneswerner Thank you so much for the very helpful review!

I will respond to what I have responses for thus far and update as additional comments are addressed.

  • Installation: The conda installation indeed is quite slow - I am hoping to go through the process of adding it to the bioconda channel after publishing the paper, and am hopeful that that will provide a speedup over my user channel.
  • Documentation: Thank you for your helpful edits; I have merged those in. Thank you also for pointing out the link issues, which should be fixed now (some of the labels are a bit awkward, which I will fix, but the links I think I converted).
  • Minimal working example: I will need to explore your BUSCO error. I have not been working in BUSCO 4.1.4 previously. It may be something where I need to specify that the older version of BUSCO should be used.
  • Tests: this test is defunct and no longer run by Travis in that form; I have removed it from the tests folder
  • Code quality: Working through the formatting issues! Will let you know once I have the numbers up
  • I have merged both of the pull requests associated with the suggestions, as well as fixed the documentation inconsistency recommended by another user of the repository

Other Questions

  • One of our databases, phylodb, actually includes prokaryotes, so that is one option, but if prokaryotes were your group of interest, you would probably want to include your own database that is more complete. Beyond that, though, EUKulele should work fine on such a sample, although it has specific things built in (e.g., the databases we've chosen) tailored towards eukaryotes.
  • I will modify the flowchart to fix this and update this thread when that is done
  • EukZoo was a recent addition; it is not tested on Travis yet, so apologies for the inconsistencies in where it is included!
  • Benchmarking is tricky because it's heavily dependent on available memory. However, at a given memory allocation, I can update you with some numbers on estimated runtimes
  • Working on linking the DOIs! I saw that in the initial check on the repository

@akrinos

akrinos commented Nov 23, 2020

@whedon generate pdf

@whedon
Author

whedon commented Nov 23, 2020

PDF failed to compile for issue #2817 with the following error:

Error reading bibliography file paper.bib:
(line 461, column 3):
unexpected "b"
expecting space, ",", white space or "}"
Looks like we failed to compile the PDF
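For anyone chasing a similar BibTeX parse error, a small helper (hypothetical, not part of whedon) can print the character at the line and column the parser reports, which makes the stray token easy to spot in a long paper.bib:

```python
def char_at(text, line, col):
    """Return the character at a 1-based (line, column) position,
    the convention BibTeX-style parsers use when reporting errors."""
    return text.splitlines()[line - 1][col - 1]

# A tiny malformed entry for illustration: a stray 'b' where the parser
# expects space, ',', whitespace, or '}'.
bib = "@article{krinos2020,\n  title = {EUKulele}\nbad line\n}\n"
print(char_at(bib, 3, 1))  # → b
```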

@akrinos

akrinos commented Nov 23, 2020

@whedon generate pdf

@whedon
Author

whedon commented Nov 23, 2020

👉📄 Download article proof 📄 View article proof on GitHub 📄 👈

@akrinos

akrinos commented Nov 23, 2020

As an update to the above:

  • DOIs are now included for all references for which DOIs are available
  • The score returned by pylint $(git ls-files '*.py') is now 8.26/10
  • The flowchart arrows have been moved such that they are no longer covering text on the documentation landing page, as well as in the paper
  • All four default databases are now referenced in the paper

We are working on benchmarking and addressing BUSCO-related issues. Thank you for your patience!

@akrinos

akrinos commented Nov 23, 2020

@johanneswerner I was able to reproduce the error you have been getting from BUSCO when trying to run the sample_EUKulele example folder.

It is indeed related to BUSCO version 4.1.4, in which version the BUSCO configuration file is stored in a different location than it was previously. I have implemented a patch that has been deployed to the conda build of EUKulele that searches for the configuration file in a different way. This solves the issue of the initial BUSCO run, but the storage location of the final BUSCO sequences will still be different due to the version change. For now, EUKulele can be run with BUSCO version 4.0.6 and Biopython 1.77. I will sort out the final versioning issues on the pip install version such that both versions should work; for now at least the BUSCO run itself functions.
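The patched configuration-file lookup described above can be sketched as a search over candidate locations. This is an illustrative sketch only, not EUKulele's actual code, and the candidate paths (environment override, conda prefix, the pre-4.1.4 location) are assumptions:

```python
import os

def find_busco_config(candidates):
    """Return the first existing file among candidate BUSCO config paths,
    or None if none exists (sketch of a version-tolerant lookup)."""
    for path in candidates:
        if path and os.path.isfile(path):
            return path
    return None

# Plausible candidates, in order of preference (paths are assumptions):
candidates = [
    os.environ.get("BUSCO_CONFIG_FILE", ""),
    os.path.join(os.environ.get("CONDA_PREFIX", ""), "config", "config.ini"),
    os.path.expanduser("~/.local/config/config.ini"),
]
config = find_busco_config(candidates)
```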

Thanks again for your patience!

@akrinos

akrinos commented Nov 25, 2020

@johanneswerner we have provided a graphic below showing how long the full EUKulele run takes to execute, in minutes, for various sizes of sequence files when using the DIAMOND alignment tool. Note that this is for metatranscriptomic (MET) sequences with or without using the TransDecoder tool for translation (and colored by two different database selections, the MMETSP and PhyloDB).

We have also uploaded pip- and conda-installable revisions of EUKulele which address the issue you encountered with the latest BUSCO version. In the recently uploaded version 1.0.1, these issues should be resolved, and you should be able to fully execute the small test example for which you reported a failing run above.

Thank you!

(benchmarking figure: full EUKulele runtime in minutes vs. input sequence file size)

@johanneswerner

Dear @akrinos

thank you for your updates. I think I have checked the checkboxes above that are taken care of (if I forgot something, please let me know).

Unfortunately, I still encountered errors with the minimum example (run.log) and some of the tests also throw errors on my virtual instance (tests.log). Could you please have a look at them?

Thank you very much for your effort, especially the benchmarking is very interesting.

@akrinos

akrinos commented Nov 26, 2020

Hmm, it looks like you're still getting the same error, which is most likely the cause of the failed tests as well (although I haven't looked carefully at each failure). Did you reinstall via conda @johanneswerner ? From the error, it looks like it is defaulting to the BUSCO install that you have locally, rather than a BUSCO install via conda. Could you please try running EUKulele --version? The problem is also in the included scripts run_busco.sh and concatenate_busco.sh, so printing the result of cat $(which run_busco.sh) and cat $(which concatenate_busco.sh) to a file would also help me verify that the fix I added is present in the files that your install is pulling. One problem I had was needing to remove prior installs.

If this continues to be an issue, I suppose we should move to another thread per the guidelines. Thanks for your persistence!
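The checks requested above can be bundled into a small shell helper (hypothetical, not shipped with EUKulele) that resolves a script on PATH and prints its contents, so the installed copy can be compared against the repository version:

```shell
#!/bin/sh
# Hypothetical helper: show which copy of a script an install resolves to,
# and dump its contents for comparison against the repository version.
dump_script() {
    path="$(command -v "$1")" || { echo "not found: $1" >&2; return 1; }
    echo "resolved: $path"
    cat "$path"
}

# With the EUKulele conda environment active, one would run:
# EUKulele --version
# dump_script run_busco.sh
# dump_script concatenate_busco.sh
```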

@johanneswerner

My apologies @akrinos, I pulled the git repository for the tests, but I forgot to reinstall via conda. Thank you for looking into it. :-)

Test dataset runs accordingly after reinstallation with conda, and the tests also pass. I marked the respective checkboxes above.

@akrinos

akrinos commented Nov 27, 2020

Thank you @johanneswerner! Did you still have one failed test as above (checkbox in initial review)? With regard to the last two remaining checkboxes: for the analysis of prokaryotes, as mentioned in 729306579, we have one default database that includes prokaryotes, and users can generally curate their own datasets including prokaryotes; we have just tailored the tool to eukaryotes. As for other software to compare ours to, one other tool I found was CCMetagen, published earlier this year. This tool identifies eukaryotes in metagenomic samples, but it is not for metatranscriptomes and only uses the NCBI database. It might be useful to point out how our approach differs from this one, which also compares itself to MEGAN. If it helps, I could include both of these explanations in either the text or the documentation, whichever seems more helpful. I think other than that, everything from your review has been addressed.

Thanks again!

@will-rowe

Thank you for your comprehensive review @johanneswerner - this is shaping up nicely.

Pinging @jcmcnch - are you still able to review this submission? Please let us know either way ASAP

@jcmcnch

jcmcnch commented Dec 8, 2020

Hi @will-rowe @akrinos sorry for not getting back to you both sooner with this. I have been busy until recently and had unsubscribed from notifications (because I was getting about a dozen notifications from JOSS daily from unrelated reviews - perhaps something can be done by JOSS to prevent this). I am back on the case now, and will provide my comments ASAP, by the end of this week at the latest.

@akrinos

akrinos commented Jan 7, 2021

Hi @will-rowe, thanks and no problem! I couldn't figure out how to edit the author list before. I ended up having to modify it to be release 1.0.2b on Zenodo here; hopefully that's okay!

@will-rowe

@whedon set 10.5281/zenodo.4422091 as archive

@whedon
Author

whedon commented Jan 7, 2021

OK. 10.5281/zenodo.4422091 is the archive.

@will-rowe

@whedon set v1.0.2b as version

@whedon
Author

whedon commented Jan 7, 2021

OK. v1.0.2b is the version.

@will-rowe

@whedon accept

@whedon whedon added the recommend-accept Papers recommended for acceptance in JOSS. label Jan 7, 2021
@whedon
Author

whedon commented Jan 7, 2021

Attempting dry run of processing paper acceptance...

@whedon
Author

whedon commented Jan 7, 2021

👋 @openjournals/joss-eics, this paper is ready to be accepted and published.

Check final proof 👉 openjournals/joss-papers#2016

If the paper PDF and Crossref deposit XML look good in openjournals/joss-papers#2016, then you can now move forward with accepting the submission by compiling again with the flag deposit=true e.g.

@whedon accept deposit=true

@whedon
Author

whedon commented Jan 7, 2021

Reference check summary (note 'MISSING' DOIs are suggestions that need verification):

OK DOIs

- 10.1093/nar/gkx1036 is OK
- 10.1186/s13059-020-02014-2 is OK
- 10.7287/peerj.preprints.27295v1 is OK
- 10.1038/s41564-018-0176-9 is OK
- 10.1038/s41467-017-02342-1 is OK
- 10.1101/2020.06.30.180687 is OK
- 10.5281/zenodo.1476236 is OK
- 10.1142/S0219720012500151 is OK
- 10.1038/ncomms11257 is OK
- 10.1111/1755-0998.13147 is OK
- 10.1038/ismej.2015.30 is OK
- 10.1038/nrmicro.2016.160 is OK
- 10.1093/database/baaa051 is OK
- 10.1111/jpy.12529 is OK
- 10.1038/s41564-019-0502-x is OK
- 10.1016/j.tim.2018.10.009 is OK
- 10.1093/nar/gks1160 is OK
- 10.1016/j.tree.2014.03.006 is OK
- 10.1371/journal.pbio.2005849 is OK
- 10.1093/gigascience/giy158 is OK
- 10.1007/978-3-319-61510-3_4 is OK
- 10.1093/bioinformatics/btv351 is OK
- 10.17226/4901 is OK
- 10.1007/978-3-319-60156-4_18 is OK
- 10.1101/gr.229202 is OK
- 10.1016/j.gpb.2015.08.003 is OK
- 10.1038/sdata.2017.203 is OK
- 10.1093/bioinformatics/btw445 is OK
- 10.1038/nmeth.3176 is OK
- 10.1101/2020.06.30.180687 is OK
- 10.1371/journal.pbio.1001889 is OK
- 10.1371/journal.pone.0016342 is OK
- 10.1016/j.cub.2017.01.017 is OK
- 10.1038/ncomms12860 is OK
- 10.1111/gcb.12983 is OK
- 10.1098/rstb.2015.0331 is OK
- 10.1007/978-3-030-38281-0_12 is OK
- 10.1038/nature12221 is OK
- 10.1038/nmeth.4197 is OK
- 10.1128/AEM.01541-09 is OK

MISSING DOIs

- None

INVALID DOIs

- None

@arfon
Member

arfon commented Jan 7, 2021

@akrinos - could you please merge this PR before we proceed?

@akrinos

akrinos commented Jan 7, 2021

I merged it in, thanks @arfon!

@arfon
Member

arfon commented Jan 8, 2021

@whedon accept

@whedon
Author

whedon commented Jan 8, 2021

Attempting dry run of processing paper acceptance...

@whedon
Author

whedon commented Jan 8, 2021

Reference check summary (note 'MISSING' DOIs are suggestions that need verification):

OK DOIs

- 10.1093/nar/gkx1036 is OK
- 10.1186/s13059-020-02014-2 is OK
- 10.7287/peerj.preprints.27295v1 is OK
- 10.1038/s41564-018-0176-9 is OK
- 10.1038/s41467-017-02342-1 is OK
- 10.1101/2020.06.30.180687 is OK
- 10.5281/zenodo.1476236 is OK
- 10.1142/S0219720012500151 is OK
- 10.1038/ncomms11257 is OK
- 10.1111/1755-0998.13147 is OK
- 10.1038/ismej.2015.30 is OK
- 10.1038/nrmicro.2016.160 is OK
- 10.1093/database/baaa051 is OK
- 10.1111/jpy.12529 is OK
- 10.1038/s41564-019-0502-x is OK
- 10.1016/j.tim.2018.10.009 is OK
- 10.1093/nar/gks1160 is OK
- 10.1016/j.tree.2014.03.006 is OK
- 10.1371/journal.pbio.2005849 is OK
- 10.1093/gigascience/giy158 is OK
- 10.1007/978-3-319-61510-3_4 is OK
- 10.1093/bioinformatics/btv351 is OK
- 10.17226/4901 is OK
- 10.1007/978-3-319-60156-4_18 is OK
- 10.1101/gr.229202 is OK
- 10.1016/j.gpb.2015.08.003 is OK
- 10.1038/sdata.2017.203 is OK
- 10.1093/bioinformatics/btw445 is OK
- 10.1038/nmeth.3176 is OK
- 10.1101/2020.06.30.180687 is OK
- 10.1371/journal.pbio.1001889 is OK
- 10.1371/journal.pone.0016342 is OK
- 10.1016/j.cub.2017.01.017 is OK
- 10.1038/ncomms12860 is OK
- 10.1111/gcb.12983 is OK
- 10.1098/rstb.2015.0331 is OK
- 10.1007/978-3-030-38281-0_12 is OK
- 10.1038/nature12221 is OK
- 10.1038/nmeth.4197 is OK
- 10.1128/AEM.01541-09 is OK

MISSING DOIs

- None

INVALID DOIs

- None

@whedon commented Jan 8, 2021

👋 @openjournals/joss-eics, this paper is ready to be accepted and published.

Check final proof 👉 openjournals/joss-papers#2021

If the paper PDF and Crossref deposit XML look good in openjournals/joss-papers#2021, then you can move forward with accepting the submission by compiling again with the flag deposit=true, e.g.

@whedon accept deposit=true

@arfon commented Jan 8, 2021

@whedon accept deposit=true

@whedon added the accepted and published labels on Jan 8, 2021
@whedon commented Jan 8, 2021

Doing it live! Attempting automated processing of paper acceptance...

@whedon commented Jan 8, 2021

🐦🐦🐦 👉 Tweet for this paper 👈 🐦🐦🐦

@whedon commented Jan 8, 2021

🚨🚨🚨 THIS IS NOT A DRILL, YOU HAVE JUST ACCEPTED A PAPER INTO JOSS! 🚨🚨🚨

Here's what you must now do:

  1. Check final PDF and Crossref metadata that was deposited 👉 Creating pull request for 10.21105.joss.02817 joss-papers#2022
  2. Wait a couple of minutes to verify that the paper DOI resolves https://doi.org/10.21105/joss.02817
  3. If everything looks good, then close this review issue.
  4. Party like you just published a paper! 🎉🌈🦄💃👻🤘

Any issues? Notify your editorial technical team...
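
Step 2 of the checklist above (waiting for the paper DOI to resolve) can also be scripted. A hedged sketch using only the standard library, where `doi_link` and `doi_resolves` are hypothetical helpers:

```python
import urllib.error
import urllib.request


def doi_link(doi: str) -> str:
    """Build the doi.org resolver URL for a bare DOI such as '10.21105/joss.02817'."""
    return "https://doi.org/" + doi.strip()


def doi_resolves(doi: str, timeout: float = 10.0) -> bool:
    """Follow doi.org's redirect chain; a registered DOI should end in an HTTP 200."""
    req = urllib.request.Request(doi_link(doi), method="HEAD")
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status == 200
    except urllib.error.HTTPError:
        return False
```

Note that some publisher sites reject HEAD requests, so in practice a GET fallback may be needed; and a freshly deposited DOI can take a few minutes to resolve, as the checklist says.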

@arfon commented Jan 8, 2021

@johanneswerner, @jcmcnch - many thanks for your reviews here and to @will-rowe for editing this submission. JOSS relies upon the volunteer efforts of folks like yourselves and we simply couldn't do this without you! ✨

@akrinos - your paper is now accepted into JOSS ⚡🚀💥

@arfon closed this as completed on Jan 8, 2021
@whedon commented Jan 8, 2021

🎉🎉🎉 Congratulations on your paper acceptance! 🎉🎉🎉

If you would like to include a link to your paper from your README use the following code snippets:

Markdown:
[![DOI](https://joss.theoj.org/papers/10.21105/joss.02817/status.svg)](https://doi.org/10.21105/joss.02817)

HTML:
<a style="border-width:0" href="https://doi.org/10.21105/joss.02817">
  <img src="https://joss.theoj.org/papers/10.21105/joss.02817/status.svg" alt="DOI badge" >
</a>

reStructuredText:
.. image:: https://joss.theoj.org/papers/10.21105/joss.02817/status.svg
   :target: https://doi.org/10.21105/joss.02817

This is how it will look in your documentation:

[DOI badge]

We need your help!

Journal of Open Source Software is a community-run journal and relies upon volunteer effort. If you'd like to support us, please consider doing either one (or both) of the following:
