
[REVIEW]: EFAtools: An R package with fast and flexible implementations of exploratory factor analysis tools #2521

Closed
40 tasks done
whedon opened this issue Jul 26, 2020 · 55 comments
Labels: accepted · C++ · published · R · recommend-accept · review · TeX



whedon commented Jul 26, 2020

Submitting author: @mdsteiner (Markus Steiner)
Repository: https://github.com/mdsteiner/EFAtools
Version: v0.1.1
Editor: @fboehm
Reviewers: @jacobsoj, @chainsawriot
Archive: 10.5281/zenodo.4032509

⚠️ JOSS reduced service mode ⚠️

Due to the challenges of the COVID-19 pandemic, JOSS is currently operating in a "reduced service mode". You can read more about what that means in our blog post.

Status


Status badge code:

HTML: <a href="https://joss.theoj.org/papers/e30bc75fc9ade6455593b9af4539a5b3"><img src="https://joss.theoj.org/papers/e30bc75fc9ade6455593b9af4539a5b3/status.svg"></a>
Markdown: [![status](https://joss.theoj.org/papers/e30bc75fc9ade6455593b9af4539a5b3/status.svg)](https://joss.theoj.org/papers/e30bc75fc9ade6455593b9af4539a5b3)

Reviewers and authors:

Please avoid lengthy details of difficulties in the review thread. Instead, please create a new issue in the target repository and link to those issues (especially acceptance-blockers) by leaving comments in the review thread below. (For completists: if the target issue tracker is also on GitHub, linking the review thread in the issue or vice versa will create corresponding breadcrumb trails in the link target.)

Reviewer instructions & questions

@jacobsoj & @chainsawriot, please carry out your review in this issue by updating the checklist below. If you cannot edit the checklist please:

  1. Make sure you're logged in to your GitHub account
  2. Be sure to accept the invite at this URL: https://github.com/openjournals/joss-reviews/invitations

The reviewer guidelines are available here: https://joss.readthedocs.io/en/latest/reviewer_guidelines.html. Any questions/concerns please let @fboehm know.

Please start on your review when you are able, and be sure to complete your review in the next six weeks, at the very latest.

Review checklist for @jacobsoj

Conflict of interest

  • I confirm that I have read the JOSS conflict of interest (COI) policy and that: I have no COIs with reviewing this work or that any perceived COIs have been waived by JOSS for the purpose of this review.

Code of Conduct

General checks

  • Repository: Is the source code for this software available at the repository URL?
  • License: Does the repository contain a plain-text LICENSE file with the contents of an OSI approved software license?
  • Contribution and authorship: Has the submitting author (@mdsteiner) made major contributions to the software? Does the full list of paper authors seem appropriate and complete?
  • Substantial scholarly effort: Does this submission meet the scope eligibility described in the JOSS guidelines?

Functionality

  • Installation: Does installation proceed as outlined in the documentation?
  • Functionality: Have the functional claims of the software been confirmed?
  • Performance: If there are any performance claims of the software, have they been confirmed? (If there are no claims, please check off this item.)

Documentation

  • A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • Installation instructions: Is there a clearly-stated list of dependencies? Ideally these should be handled with an automated package management solution.
  • Example usage: Do the authors include examples of how to use the software (ideally to solve real-world analysis problems)?
  • Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g., API method documentation)?
  • Automated tests: Are there automated tests or manual steps described so that the functionality of the software can be verified?
  • Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the software 2) Report issues or problems with the software 3) Seek support?

Software paper

  • Summary: Has a clear description of the high-level functionality and purpose of the software for a diverse, non-specialist audience been provided?
  • A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • State of the field: Do the authors describe how this software compares to other commonly-used packages?
  • Quality of writing: Is the paper well written (i.e., it does not require editing for structure, language, or writing quality)?
  • References: Is the list of references complete, and is everything cited appropriately that should be cited (e.g., papers, datasets, software)? Do references in the text use the proper citation syntax?

Review checklist for @chainsawriot

Conflict of interest

  • I confirm that I have read the JOSS conflict of interest (COI) policy and that: I have no COIs with reviewing this work or that any perceived COIs have been waived by JOSS for the purpose of this review.

Code of Conduct

General checks

  • Repository: Is the source code for this software available at the repository URL?
  • License: Does the repository contain a plain-text LICENSE file with the contents of an OSI approved software license?
  • Contribution and authorship: Has the submitting author (@mdsteiner) made major contributions to the software? Does the full list of paper authors seem appropriate and complete?
  • Substantial scholarly effort: Does this submission meet the scope eligibility described in the JOSS guidelines?

Functionality

  • Installation: Does installation proceed as outlined in the documentation?
  • Functionality: Have the functional claims of the software been confirmed?
  • Performance: If there are any performance claims of the software, have they been confirmed? (If there are no claims, please check off this item.)

Documentation

  • A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • Installation instructions: Is there a clearly-stated list of dependencies? Ideally these should be handled with an automated package management solution.
  • Example usage: Do the authors include examples of how to use the software (ideally to solve real-world analysis problems)?
  • Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g., API method documentation)?
  • Automated tests: Are there automated tests or manual steps described so that the functionality of the software can be verified?
  • Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the software 2) Report issues or problems with the software 3) Seek support?

Software paper

  • Summary: Has a clear description of the high-level functionality and purpose of the software for a diverse, non-specialist audience been provided?
  • A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • State of the field: Do the authors describe how this software compares to other commonly-used packages?
  • Quality of writing: Is the paper well written (i.e., it does not require editing for structure, language, or writing quality)?
  • References: Is the list of references complete, and is everything cited appropriately that should be cited (e.g., papers, datasets, software)? Do references in the text use the proper citation syntax?

whedon commented Jul 26, 2020

Hello human, I'm @whedon, a robot that can help you with some common editorial tasks. @jacobsoj, @chainsawriot it looks like you're currently assigned to review this paper 🎉.


⭐ Important ⭐

If you haven't already, you should seriously consider unsubscribing from GitHub notifications for this (https://github.com/openjournals/joss-reviews) repository. As a reviewer, you're probably currently watching this repository, which, with GitHub's default behaviour, means you will receive notifications (emails) for all reviews 😿

To fix this do the following two things:

  1. Set yourself as 'Not watching' at https://github.com/openjournals/joss-reviews:


  2. You may also like to change your default settings for watching repositories in your GitHub profile here: https://github.com/settings/notifications


For a list of things I can do to help you, just type:

@whedon commands

For example, to regenerate the paper pdf after making changes in the paper's md or bib files, type:

@whedon generate pdf


whedon commented Jul 26, 2020

Reference check summary:

OK DOIs

- 10.1177/1073191119845051 is OK
- 10.1007/BF02289209 is OK
- 10.7275/jyj1-4868 is OK
- 10.1177/0095798418771807 is OK
- 10.7287/peerj.preprints.3188v1 is OK
- 10.1016/j.csda.2013.02.005 is OK
- 10.1037/met0000200 is OK
- 10.1037/met0000074 is OK
- 10.1037/a0025697 is OK
- 10.1080/00273171.2011.564527 is OK
- 10.18637/jss.v048.i02 is OK

MISSING DOIs

- None

INVALID DOIs

- None


whedon commented Jul 26, 2020

@chainsawriot

Thanks @mdsteiner & Co. for this wonderful package.

Software paper

The paper is well written. But I don't know JOSS's policy well enough regarding citing a "Manuscript in preparation". (Could @fboehm confirm this?) Obviously, the two authors are probably the only people who have that paper right now. I can't 'see Grieder & Steiner, 2020', as per the in-text citation.

I have 2 suggestions:

  1. Put it as a preprint and then cite it.

  2. If it can't be put as a preprint (we know how these social science journals work...), maybe just say

    the ability to reproduce the R psych (Revelle, 2019) and SPSS (IBM, 2015) implementations of some analysis methods, as well as the inclusion of experimental implementations of these methods based on simulation analyses by the authors (pending publication).

Package

Functions

I think the documentation and the vignette are very easy to read. I can understand the package very quickly. All methods are well referenced. The crayon-based output is a nice touch.

The package is feature-rich and it deals with all aspects of EFA. I especially like the function N_FACTORS because it helps to solve the problem of choosing n_factors.

The package tries to match SPSS's offerings. I only have one suggestion (which might be too much to ask): it would be nice to have the ability to plot a scree plot. A biplot might be nice too. Some reviewers would ask for it.

Misc

  • Please remove .DS_Store from your repo. I don't know why your .gitignore doesn't work. But the following should at least do the trick of finding all .DS_Store files and removing them from the index.
find . -name .DS_Store -print0 | xargs -0 git rm -f --ignore-unmatch
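For context: entries in .gitignore do not affect files that are already tracked, which is the usual reason an existing .gitignore entry appears not to work. A minimal sketch of the full cleanup, demonstrated here in a throwaway repository (the demo setup is illustrative; in a real repo only the find line and the .gitignore line are needed):

```shell
# Set up a throwaway repository so the demo is self-contained.
tmp=$(mktemp -d) && cd "$tmp"
git init -q demo && cd demo
git -c user.email=demo@example.com -c user.name=demo commit -q --allow-empty -m init
touch .DS_Store data.txt
git add -f .DS_Store data.txt   # .DS_Store is now tracked, despite any .gitignore

# The reviewer's one-liner: find every .DS_Store and drop it from the index.
find . -name .DS_Store -print0 | xargs -0 git rm -f --ignore-unmatch

# Ignore it from now on, so it cannot be re-added by accident.
echo ".DS_Store" >> .gitignore
git ls-files   # data.txt remains tracked; .DS_Store is gone
```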

Advice from goodpractice::gp(). You don't need to consider all of the points, but I think some are useful and not too opinionated.

  • Test coverage is 65%, which is okay. But some important lines are not tested, e.g.

    • R/BARTLETT.R:107 (corner cases that require cor.smooth)
    • R/CD.R:236, 295, 303 (corner cases when shared_load[1,1] < 0)
  • exportPattern is generated in NAMESPACE from this line. Is it really needed? It also makes me wonder: do you need to export all helper functions (e.g. .numformat)?

  • Adding URL / BugReports to the DESCRIPTION file would help your users find your GitHub repo.


fboehm commented Jul 27, 2020

Regarding the comments from @chainsawriot about the manuscript in preparation: it would be ideal if the authors could post a preprint and cite it, but if that's not possible, then I like the other suggestion from @chainsawriot, too.

@mdsteiner

Thank you @chainsawriot for the fast review and the many helpful comments and suggestions! Please find below our responses to your raised points.

Software paper

The paper is well written. But I don't know JOSS's policy well enough regarding citing a "Manuscript in preparation". (Could @fboehm confirm this?) Obviously, the two authors are probably the only people who have that paper right now. I can't 'see Grieder & Steiner, 2020', as per the in-text citation.

I have 2 suggestions:

  • Put it as a preprint and then cite it.
  • If it can't be put as a preprint (we know how these social science journals work...), maybe just say
    the ability to reproduce the R psych (Revelle, 2019) and SPSS (IBM, 2015) implementations of some analysis methods, as well as the inclusion of experimental implementations of these methods based on simulation analyses by the authors (pending publication).

Response:
As the paper is not yet in a form we are comfortable sharing as a preprint, we would, for now, like to opt for the reviewer's second suggestion. We changed the respective sentence in the paper accordingly. However, we hope to have a shareable version ready in time to still be able to include a reference to a preprint here.

To demonstrate that our package enables replication of the R psych and SPSS solutions, we added a vignette to the package and refer to it in the paper as well.

Package

Functions

I think the documentation and the vignette are very easy to read. I can understand the package very quickly. All methods are well referenced. The crayon-based output is a nice touch.

The package is feature-rich and it deals with all aspects of EFA. I especially like the function N_FACTORS because it helps to solve the problem of choosing n_factors.

The package tries to match SPSS's offerings. I only have one suggestion (which might be too much to ask): it would be nice to have the ability to plot a scree plot. A biplot might be nice too. Some reviewers would ask for it.

Response:
Thank you for this feedback! We include a scree plot in the KGC and PARALLEL functions, but these plots are not shown if KGC or PARALLEL are called within the N_FACTORS function. To make it easier to get a scree plot, we have now added a function SCREE, which creates scree plots of the eigenvalues determined with "PCA", "SMC", or "EFA", or a combination of these. This function can now also be called in N_FACTORS, just like the other factor retention criteria.

Furthermore, we also added a function FACTOR_SCORES, which is a wrapper for psych::factor.scores to be used directly with output from EFA.

We also like the idea of adding the option to do a biplot and would like to consider this as a future enhancement of our package.
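For readers following along, the new functions might be used roughly like this (a sketch based on the descriptions above; the test_models example data and argument names such as eigen_type follow the EFAtools documentation at the time of review and may change):

```r
library(EFAtools)

# Scree plot(s) of the eigenvalues, based on PCA- and SMC-determined
# eigenvalues of the example correlation matrix shipped with the package.
SCREE(test_models$baseline$cormat, eigen_type = c("PCA", "SMC"))

# FACTOR_SCORES wraps psych::factor.scores for direct use with an EFA output.
efa_fit <- EFA(test_models$baseline$cormat, n_factors = 3)
# Raw data are needed for factor scores; raw_data here is a hypothetical
# data frame with the same variables as the correlation matrix.
# scores <- FACTOR_SCORES(raw_data, f = efa_fit)
```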

Misc

Please remove .DS_Store from your repo. I don't know why your .gitignore doesn't work. But the following should at least do the trick of finding all .DS_Store files and removing them from the index.
find . -name .DS_Store -print0 | xargs -0 git rm -f --ignore-unmatch

Response:
Thank you for this very helpful suggestion; we have now removed all .DS_Store files.

Advice from goodpractice::gp(). You don't need to consider all of the points, but I think some are useful and not too opinionated.

Test coverage is 65%, which is okay. But some important lines are not tested, e.g.
R/BARTLETT.R:107 (corner cases that require cor.smooth)
R/CD.R:236, 295, 303 (corner cases when shared_load[1,1] < 0)

Response:
Thank you for making us aware of this. We did not know the goodpractice package and find it really helpful. We have now added new tests for some helper functions that were not tested before, and updated many of the existing tests to cover more cases. Test coverage is now 74.7%. We do not test print and plot functions (which make up most of the untested lines), as well as some very minor cases and cases that would bear the risk of unstable results (e.g., the occurrence of Heywood cases or non-convergence when random data are generated).
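As an illustration of the kind of corner-case test discussed here, a hypothetical testthat sketch (not the package's actual test code; whether BARTLETT warns, messages, or errors on an indefinite matrix depends on the implementation):

```r
library(testthat)
library(EFAtools)

test_that("BARTLETT copes with an indefinite correlation matrix", {
  # An impossible correlation pattern: variables 1 and 2 each correlate
  # strongly positively with variable 3 but strongly negatively with each
  # other, so R is not positive definite and smoothing is required.
  R <- matrix(c(   1, -.99, .99,
                -.99,    1, .99,
                 .99,  .99,   1), nrow = 3)
  expect_condition(BARTLETT(R, N = 100))
})
```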

exportPattern is generated in NAMESPACE from this line. Is it really needed? It also makes me wonder: do you need to export all helper functions (e.g. .numformat)?

Response:
Thank you for pointing this out! We have now deleted the exportPattern and export all functions explicitly. This way, the helper functions remain hidden.
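The change can be sketched with the usual roxygen2 conventions (illustrative placeholders; SCREE is an exported function and .numformat an internal helper mentioned in this review, but the bodies and signatures below are not the package's actual code):

```r
# Before: NAMESPACE contained a blanket pattern, e.g.
#   exportPattern("^[[:alpha:]]+")
# which exports every object whose name begins with a letter.

# After: only user-facing functions carry an @export tag, and roxygen2
# writes explicit export() directives into NAMESPACE.

#' Scree plot of eigenvalues
#' @export
SCREE <- function(x, eigen_type = c("PCA", "SMC", "EFA"), ...) {
  # ... plotting code ...
}

# No @export tag: stays internal, reachable only as EFAtools:::.numformat
.numformat <- function(x, digits = 2) {
  # ... formatting code ...
}
```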

Adding URL / BugReports to the DESCRIPTION file would help your users find your GitHub repo.

Response:
Thank you for this suggestion! We have now added the respective fields to the DESCRIPTION file.
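For reference, the two added fields look like this in DESCRIPTION (the issues URL follows the standard GitHub pattern for the repository under review):

```
URL: https://github.com/mdsteiner/EFAtools
BugReports: https://github.com/mdsteiner/EFAtools/issues
```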

@chainsawriot

@whedon generate pdf


whedon commented Aug 3, 2020

@chainsawriot

Thank you so much for the revision, @mdsteiner. Both the paper and the software look fine to me now.

Back to you @fboehm .


fboehm commented Aug 3, 2020

Thank you, @chainsawriot ! Just to verify - are you content with the documentation around community guidelines?

Thanks again!


fboehm commented Aug 3, 2020

@jacobsoj - please let me know if you have any questions during your review. Thanks again!

@chainsawriot

@fboehm The documentation around community guidelines looks fine to me. Really sorry that I forgot to check the box. It is now checked. Thank you very much!


fboehm commented Aug 17, 2020

@jacobsoj - I just wanted to ask if you have any questions about the review process. Please feel free to check the boxes as you examine the package and manuscript. Thanks again!


jacobsoj commented Aug 17, 2020 via email


fboehm commented Aug 17, 2020

Thanks, @jacobsoj ! I really appreciate your assistance with this.

Just to clarify - JOSS reviews are a little different from those at many journals. You'll want to see which items you can check off in the checklist above as you review the software and manuscript. For any boxes that you can't check right now, you'll want to tell the authors why you can't check the box, i.e., what is missing or what is not working. The authors will then fix what you indicated. When they're satisfied that they've fixed the issues, they'll ask you to verify that you're satisfied with their work. The review is considered complete once all boxes are checked.

Thank you again, and please let me know how I might assist you as you work through the review.


fboehm commented Aug 30, 2020

@jacobsoj - I just wanted to check in to see how things are going. I recognize that this is your first JOSS review, so it might be a little confusing. We primarily use the checklist that's available above to review the software and manuscript. Please let me know if you have any questions or concerns. Thanks again!!

@mdsteiner

Thanks again @chainsawriot for the helpful comments! We have now uploaded a preprint of our paper where we report the simulation analyses concerning the default implementation of PAF and promax used in the EFA() function and have adapted our citation in the manuscript by citing the preprint as you suggested.

@chainsawriot

@whedon generate pdf


whedon commented Sep 1, 2020

@chainsawriot

@mdsteiner Thanks for keeping me updated. It is really nice that the preprint is available for all of us. I hope that the review of both papers (this JOSS paper and your preprint) can go faster.


fboehm commented Sep 8, 2020

@whedon add @jacobsoj as reviewer


whedon commented Sep 14, 2020

OK, the reviewer has been re-invited.

@jacobsoj please accept the invite by clicking this link: https://github.com/openjournals/joss-reviews/invitations


fboehm commented Sep 14, 2020

@jacobsoj - Please click on the link above to accept the invitation. I've added you again so that you can continue to participate in the review, should there be a need. We're nearly ready to accept the submission, so there may not be anything left for you to do. We're not sure why you lost access to the repository. Sorry for the trouble.

@jacobsoj

FWIW I reclicked on the 2 boxes although you had done that for me, just in case it matters later that I did it myself.


fboehm commented Sep 15, 2020

@whedon generate pdf


whedon commented Sep 15, 2020

👉📄 Download article proof 📄 View article proof on GitHub 📄 👈


fboehm commented Sep 15, 2020

@whedon generate pdf


whedon commented Sep 15, 2020

👉📄 Download article proof 📄 View article proof on GitHub 📄 👈


fboehm commented Sep 15, 2020

@mdsteiner - The reviewers have recommended publication of your submission. Before we can do that, I need you to: 1. examine the article pdf proof for errors, and 2. archive the package with, for example, Zenodo or figshare. Please record and report the version number and archive DOI in this comments thread. Double-check that the metadata for the archived package has the same title and author list as your JOSS manuscript.
Thanks again!

@mdsteiner

Thanks @fboehm! We have checked the pdf proof and everything looks fine. The package is now archived as version 0.2.0 on Zenodo with the archive DOI 10.5281/zenodo.4032509.

Thanks again @chainsawriot and @jacobsoj for the many helpful comments and suggestions!


fboehm commented Sep 16, 2020

@whedon set 10.5281/zenodo.4032509 as archive


whedon commented Sep 16, 2020

OK. 10.5281/zenodo.4032509 is the archive.


fboehm commented Sep 16, 2020

@whedon accept


whedon commented Sep 16, 2020

Attempting dry run of processing paper acceptance...

whedon added the recommend-accept label Sep 16, 2020

whedon commented Sep 16, 2020

Reference check summary (note 'MISSING' DOIs are suggestions that need verification):

OK DOIs

- 10.1177/1073191119845051 is OK
- 10.1007/BF02289209 is OK
- 10.7275/jyj1-4868 is OK
- 10.1177/0095798418771807 is OK
- 10.7287/peerj.preprints.3188v1 is OK
- 10.1016/j.csda.2013.02.005 is OK
- 10.1037/met0000200 is OK
- 10.1037/met0000074 is OK
- 10.1037/a0025697 is OK
- 10.1080/00273171.2011.564527 is OK
- 10.18637/jss.v048.i02 is OK
- 10.31234/osf.io/7hwrm is OK

MISSING DOIs

- None

INVALID DOIs

- None


whedon commented Sep 16, 2020

👋 @openjournals/joss-eics, this paper is ready to be accepted and published.

Check final proof 👉 openjournals/joss-papers#1731

If the paper PDF and Crossref deposit XML look good in openjournals/joss-papers#1731, then you can now move forward with accepting the submission by compiling again with the flag deposit=true e.g.

@whedon accept deposit=true


fboehm commented Sep 16, 2020

Thank you, @mdsteiner for the archiving. Thanks to @jacobsoj and @chainsawriot for excellent work on the reviews. The submission is now recommended for acceptance. The editors in chief will make a final review.

@kyleniemeyer

@whedon accept deposit=true


whedon commented Sep 16, 2020

Doing it live! Attempting automated processing of paper acceptance...

whedon added the published label Sep 16, 2020

whedon commented Sep 16, 2020

🐦🐦🐦 👉 Tweet for this paper 👈 🐦🐦🐦


whedon commented Sep 16, 2020

🚨🚨🚨 THIS IS NOT A DRILL, YOU HAVE JUST ACCEPTED A PAPER INTO JOSS! 🚨🚨🚨

Here's what you must now do:

  1. Check final PDF and Crossref metadata that was deposited 👉 Creating pull request for 10.21105.joss.02521 joss-papers#1733
  2. Wait a couple of minutes to verify that the paper DOI resolves https://doi.org/10.21105/joss.02521
  3. If everything looks good, then close this review issue.
  4. Party like you just published a paper! 🎉🌈🦄💃👻🤘

Any issues? Notify your editorial technical team...

@kyleniemeyer

Congrats @mdsteiner on your article's publication in JOSS!

Many thanks to @jacobsoj and @chainsawriot for reviewing this, and @fboehm for editing.


whedon commented Sep 16, 2020

🎉🎉🎉 Congratulations on your paper acceptance! 🎉🎉🎉

If you would like to include a link to your paper from your README use the following code snippets:

Markdown:
[![DOI](https://joss.theoj.org/papers/10.21105/joss.02521/status.svg)](https://doi.org/10.21105/joss.02521)

HTML:
<a style="border-width:0" href="https://doi.org/10.21105/joss.02521">
  <img src="https://joss.theoj.org/papers/10.21105/joss.02521/status.svg" alt="DOI badge" >
</a>

reStructuredText:
.. image:: https://joss.theoj.org/papers/10.21105/joss.02521/status.svg
   :target: https://doi.org/10.21105/joss.02521

This is how it will look in your documentation:


We need your help!

Journal of Open Source Software is a community-run journal and relies upon volunteer effort. If you'd like to support us, please consider doing either one (or both) of the following:
