
[REVIEW]: JMcDM: A Julia package for multiple-criteria decision-making tools #3430

Closed · 40 tasks done
whedon opened this issue Jun 29, 2021 · 68 comments
Labels: accepted, Julia, published, recommend-accept, review, TeX

whedon commented Jun 29, 2021

Submitting author: @jbytecode (Mehmet Hakan Satman)
Repository: https://github.com/jbytecode/JMcDM
Version: v0.2.4
Editor: @drvinceknight
Reviewer: @brunaw, @sylvaticus
Archive: 10.5281/zenodo.5534663

⚠️ JOSS reduced service mode ⚠️

Due to the challenges of the COVID-19 pandemic, JOSS is currently operating in a "reduced service mode". You can read more about what that means in our blog post.

Status


Status badge code:

HTML: <a href="https://joss.theoj.org/papers/2f04c500e51f28e2273cb858f20c3eff"><img src="https://joss.theoj.org/papers/2f04c500e51f28e2273cb858f20c3eff/status.svg"></a>
Markdown: [![status](https://joss.theoj.org/papers/2f04c500e51f28e2273cb858f20c3eff/status.svg)](https://joss.theoj.org/papers/2f04c500e51f28e2273cb858f20c3eff)

Reviewers and authors:

Please avoid lengthy details of difficulties in the review thread. Instead, please create a new issue in the target repository and link to those issues (especially acceptance-blockers) by leaving comments in the review thread below. (For completists: if the target issue tracker is also on GitHub, linking the review thread in the issue or vice versa will create corresponding breadcrumb trails in the link target.)

Reviewer instructions & questions

@brunaw & @sylvaticus, please carry out your review in this issue by updating the checklist below. If you cannot edit the checklist please:

  1. Make sure you're logged in to your GitHub account
  2. Be sure to accept the invite at this URL: https://github.com/openjournals/joss-reviews/invitations

The reviewer guidelines are available here: https://joss.readthedocs.io/en/latest/reviewer_guidelines.html. If you have any questions/concerns, please let @drvinceknight know.

Please start on your review when you are able, and be sure to complete it within the next six weeks at the very latest.

Review checklist for @brunaw

Conflict of interest

  • I confirm that I have read the JOSS conflict of interest (COI) policy and that: I have no COIs with reviewing this work or that any perceived COIs have been waived by JOSS for the purpose of this review.

Code of Conduct

General checks

  • Repository: Is the source code for this software available at the repository url?
  • License: Does the repository contain a plain-text LICENSE file with the contents of an OSI approved software license?
  • Contribution and authorship: Has the submitting author (@jbytecode) made major contributions to the software? Does the full list of paper authors seem appropriate and complete?
  • Substantial scholarly effort: Does this submission meet the scope eligibility described in the JOSS guidelines

Functionality

  • Installation: Does installation proceed as outlined in the documentation?
  • Functionality: Have the functional claims of the software been confirmed?
  • Performance: If there are any performance claims of the software, have they been confirmed? (If there are no claims, please check off this item.)

Documentation

  • A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • Installation instructions: Is there a clearly-stated list of dependencies? Ideally these should be handled with an automated package management solution.
  • Example usage: Do the authors include examples of how to use the software (ideally to solve real-world analysis problems).
  • Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g., API method documentation)?
  • Automated tests: Are there automated tests or manual steps described so that the functionality of the software can be verified?
  • Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the software 2) Report issues or problems with the software 3) Seek support

Software paper

  • Summary: Has a clear description of the high-level functionality and purpose of the software for a diverse, non-specialist audience been provided?
  • A statement of need: Does the paper have a section titled 'Statement of Need' that clearly states what problems the software is designed to solve and who the target audience is?
  • State of the field: Do the authors describe how this software compares to other commonly-used packages?
  • Quality of writing: Is the paper well written (i.e., it does not require editing for structure, language, or writing quality)?
  • References: Is the list of references complete, and is everything cited appropriately that should be cited (e.g., papers, datasets, software)? Do references in the text use the proper citation syntax?

Review checklist for @sylvaticus

Conflict of interest

  • I confirm that I have read the JOSS conflict of interest (COI) policy and that: I have no COIs with reviewing this work or that any perceived COIs have been waived by JOSS for the purpose of this review.

Code of Conduct

General checks

  • Repository: Is the source code for this software available at the repository url?
  • License: Does the repository contain a plain-text LICENSE file with the contents of an OSI approved software license?
  • Contribution and authorship: Has the submitting author (@jbytecode) made major contributions to the software? Does the full list of paper authors seem appropriate and complete?
  • Substantial scholarly effort: Does this submission meet the scope eligibility described in the JOSS guidelines

Functionality

  • Installation: Does installation proceed as outlined in the documentation?
  • Functionality: Have the functional claims of the software been confirmed?
  • Performance: If there are any performance claims of the software, have they been confirmed? (If there are no claims, please check off this item.)

Documentation

  • A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • Installation instructions: Is there a clearly-stated list of dependencies? Ideally these should be handled with an automated package management solution.
  • Example usage: Do the authors include examples of how to use the software (ideally to solve real-world analysis problems).
  • Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g., API method documentation)?
  • Automated tests: Are there automated tests or manual steps described so that the functionality of the software can be verified?
  • Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the software 2) Report issues or problems with the software 3) Seek support

Software paper

  • Summary: Has a clear description of the high-level functionality and purpose of the software for a diverse, non-specialist audience been provided?
  • A statement of need: Does the paper have a section titled 'Statement of Need' that clearly states what problems the software is designed to solve and who the target audience is?
  • State of the field: Do the authors describe how this software compares to other commonly-used packages?
  • Quality of writing: Is the paper well written (i.e., it does not require editing for structure, language, or writing quality)?
  • References: Is the list of references complete, and is everything cited appropriately that should be cited (e.g., papers, datasets, software)? Do references in the text use the proper citation syntax?
@whedon
Copy link
Author

whedon commented Jun 29, 2021

Hello human, I'm @whedon, a robot that can help you with some common editorial tasks. @brunaw, @sylvaticus it looks like you're currently assigned to review this paper 🎉.

⭐ Important ⭐

If you haven't already, you should seriously consider unsubscribing from GitHub notifications for this (https://github.com/openjournals/joss-reviews) repository. As a reviewer, you're probably currently watching this repository, which means with GitHub's default behaviour you will receive notifications (emails) for all reviews 😿

To fix this do the following two things:

  1. Set yourself as 'Not watching' at https://github.com/openjournals/joss-reviews:

  (screenshot: repository watch settings)

  2. You may also like to change your default notification settings for watched repositories in your GitHub profile here: https://github.com/settings/notifications

  (screenshot: notification settings)

For a list of things I can do to help you, just type:

@whedon commands

For example, to regenerate the paper pdf after making changes in the paper's md or bib files, type:

@whedon generate pdf


whedon commented Jun 29, 2021

Software report (experimental):

github.com/AlDanial/cloc v 1.88  T=0.04 s (1181.6 files/s, 194919.0 lines/s)
-------------------------------------------------------------------------------
Language                     files          blank        comment           code
-------------------------------------------------------------------------------
Julia                           30           1379            167           4300
TOML                             4            134              2            548
Markdown                         8            128              0            385
TeX                              1             42              0            355
Lisp                             3            133              0            295
YAML                             2              1              0             49
-------------------------------------------------------------------------------
SUM:                            48           1817            169           5932
-------------------------------------------------------------------------------


Statistical information for the repository '63ecc02f3f1ba5c2e5035f2e' was
gathered on 2021/06/29.
No commited files with the specified extensions were found.


whedon commented Jun 29, 2021

👉📄 Download article proof 📄 View article proof on GitHub 📄 👈


whedon commented Jun 29, 2021

Reference check summary (note 'MISSING' DOIs are suggestions that need verification):

OK DOIs

- 10.1109/4235.996017 is OK
- 10.1016/j.softx.2019.02.004 is OK
- 10.1109/ieem.2009.5373124 is OK
- 10.1002/9780470400531.eorms0970 is OK
- 10.1016/s2212-5671(14)00342-6 is OK
- 10.1007/978-3-319-06647-9 is OK
- 10.1137/141000671 is OK
- 10.1007/978-3-642-48318-9_3 is OK
- 10.1051/ro/196802v100571 is OK
- 10.1111/1467-8667.00269 is OK
- 10.1016/0022-2496(77)90033-5 is OK
- 10.1016/s0167-6911(82)80025-x is OK
- 10.1016/0167-9236(89)90037-7 is OK
- 10.1287/opre.2.2.172 is OK
- 10.5755/j01.eee.122.6.1810 is OK
- 10.15388/informatica.2015.57 is OK
- 10.1016/j.eswa.2014.11.057 is OK
- 10.1016/j.cie.2019.106231 is OK
- 10.1108/md-05-2017-0458 is OK
- 10.1016/0305-0548(94)00059-h is OK
- 10.1002/j.1538-7305.1948.tb01338.x is OK
- 10.1016/0377-2217(78)90138-8 is OK

MISSING DOIs

- None

INVALID DOIs

- None


sylvaticus commented Jul 1, 2021

Sylvaticus review

The software follows the Julia standards for registered packages: it is easy to install, it has a clear licence, the methods are documented, and an extensive testing system covers much of the software.
I have a few minor/specific points concerning the documentation, the API, and the JOSS paper in the sections that follow.

I also have two more general comments.

The first one is that, while several other "similar" software packages are cited in the paper, the innovation brought by this package is still not clear. Is it easier to use than the cited software? Is it more complete in terms of methods? Is it faster for large problems (here a benchmark would be ideal)? I believe the package would benefit from a slightly more detailed "state of the field"/"statement of need" (in the paper and/or in the online documentation).

The second general comment is somehow related. While the number of supported methods is impressive, they read a bit like a list of completely unrelated tools.
I believe it would be better, in both the documentation and the API design, to look for a taxonomy or a way to group/organise the different methods.
One possible suggestion could be to have an interface function like MCDM(df, w, fns; method=Topsis()) that wraps the various methods. In this way, the user would only have to remember one function and could then get more info about the supported methods with ?MCDM, rather than browsing the documentation for the individual functions.
The function would have an optional method keyword that receives a method-specific structure with its parameters (e.g. v for vikor, or zeta for grey).
Also, the triplet (df, w, fns) could be documented in more detail, but only once, and then each specific method would have its own documentation.
The same goes for the output… most methods return a structure with decisionMatrix, weights, scores, ranking, and bestIndex. I believe it would be better to create a common output with these fields (where they could be documented once) plus an object carrying the method-specific information. Note that, with the suggested approach, the MCDM function would remain type stable: since each method would be a different type, the compiler would still specialise each "method".
Documenting the summary function could also be useful, as well as (if they exist) some indications on when one method should be preferable to the others.
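The grouping suggested above could be sketched roughly as follows. This is a minimal illustration only: `mcdmsolve`, `Topsis`, `Vikor`, and the stand-in result strings are hypothetical names I made up for the sketch, not JMcDM's actual API.

```julia
# Hypothetical sketch of a single dispatch-based entry point.
abstract type AbstractMCDMMethod end

struct Topsis <: AbstractMCDMMethod end

struct Vikor <: AbstractMCDMMethod
    v::Float64           # method-specific parameter
end
Vikor() = Vikor(0.5)     # a sensible default

# One user-facing function; the keyword selects the concrete method.
mcdmsolve(dm, w, fns; method::AbstractMCDMMethod = Topsis()) =
    mcdmsolve(dm, w, fns, method)

# Per-method implementations (stand-in bodies). Because each method is
# its own concrete type, the compiler still specialises each call.
mcdmsolve(dm, w, fns, ::Topsis) = "topsis on $(size(dm)) matrix"
mcdmsolve(dm, w, fns, m::Vikor) = "vikor with v=$(m.v)"
```

A call such as `mcdmsolve(dm, w, fns; method=Vikor(0.4))` then dispatches to the Vikor implementation while keeping one documented entry point.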

Following are my specific points.

Minor/specific points concerning the software

  • the optional parameters, like v in vikor and zeta in grey, are sometimes positional arguments and sometimes keyword arguments
  • in promethee, maybe it is better to keep the first 3 parameters as in the other methods (dm, w, fns) and then add the method-specific argument parameters
  • use a default (all methods? the "recommended" ones?) for the summary function
  • implement a print method for the results
  • What does the function makeminmax do? It seems it is not needed:
julia> fns2 = [minimum, maximum, minimum, minimum, maximum]
5-element Vector{Function}:
 minimum (generic function with 18 methods)
 maximum (generic function with 17 methods)
 minimum (generic function with 18 methods)
 minimum (generic function with 18 methods)
 maximum (generic function with 17 methods)

julia> fns = makeminmax([minimum, maximum, minimum, minimum, maximum])
5-element Vector{Function}:
 minimum (generic function with 18 methods)
 maximum (generic function with 17 methods)
 minimum (generic function with 18 methods)
 minimum (generic function with 18 methods)
 maximum (generic function with 17 methods)

julia> fns == fns2
true
  • What does the undocumented makeDecisionMatrix function do, other than transforming a matrix into a DataFrame?
  • When I apply the data of the paper to the moora function, all decisions get the same score. Maybe there should be a warning or a flag when the algorithm doesn't converge, or bestIndex should rather be bestIndices, with a singleton being a particular case (not only here in the moora function, but more in general)
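Regarding the makeminmax question above, one plausible reason for the function to exist (an assumption on my part, not verified against the package source) is Julia's element-type narrowing for homogeneous function arrays:

```julia
# When every element is the same function, Julia infers a narrower
# element type than Function:
fns_homogeneous = [maximum, maximum, maximum]
fns_mixed       = [minimum, maximum, minimum]

typeof(fns_homogeneous)   # Vector{typeof(maximum)}
typeof(fns_mixed)         # Vector{Function}

# A signature like topsis(::DataFrame, ::Vector{Float64}, ::Vector{Function})
# therefore rejects the homogeneous array: Julia's parametric types are
# invariant, so Vector{typeof(maximum)} is not a subtype of Vector{Function}.
# An explicit conversion sidesteps the mismatch:
fns_converted = convert(Vector{Function}, fns_homogeneous)
```

In the example shown above both arrays behave identically at runtime, which is why `fns == fns2` holds; the difference only matters for dispatch against a `Vector{Function}` signature.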

Minor/specific points concerning the text of the JOSS paper

  • L1 (title): I don't think the word "for" goes well with the word "tools"... maybe "A Julia package for multiple-criteria decision-making" or "A Julia collection of multiple-criteria decision-making tools"...
  • L11-13: I am not sure whether "the package" refers to Julia or to this package in particular. In the first case, "The language" would be better; in the second case, it is more about "using" rather than "writing" recently published methods. Also, you use the words "multiple-criteria decision-making" but then use the acronym for "multiple-criteria decision-analysis" (MCDA).
  • L 10,52,71,78: "REPL" is cited (without definition) 4 times. I don't think it is necessary to focus so much on the REPL
  • L 55-64: Maybe order these methods, alphabetically or by year?
  • I believe there is a misunderstanding about the "State of the field" section. While I like its content and I think it is important to give an idea of the subject covered by the software, that section is more for comparing the proposed package with the existing alternatives (from the reviewer checklist: "State of the field: Do the authors describe how this software compares to other commonly-used packages?")

Minor/specific points concerning the online documentation

  • It is useful to have the documentation for the stable version (and all the different released versions) separated from the development version. This can be done automatically using Documenter; see here for a tutorial and an associated "dummy" package
  • I also find it useful to have a link from the documentation back to the main page of the GitHub repo (the "Edit on GitHub" link sends to an inner page)
  • Also, while the testing seems extensive, having a coverage section would help to highlight the parts that are still not covered by the tests. Again, refer to the same page for a template to add automatic coverage reports.
  • JMcDM.electre function documentation: the output is copy/pasted from the topsis function above, but it has not been adapted. What are C and D?
  • dematel function documentation: comparisonMat::Array{Float,2}: n × m matrix of input values. What are the input values? Also, the doc says Matrix, but the function wants a DataFrame. I can't get this function working with the DataFrame in the paper. It seems the matrix must be square, or a dimension mismatch error happens.
  • AHP documentation: the arguments of the function in the given example are (df, vector of matrices), while in the docstring both are DataFrames
  • promethee docstring: the conversion of prefs is not needed
  • Some useful functions, like summary, are undocumented
  • The "community guidelines" for contributing to the software are missing
  • Please provide the reference to each method as a clickable link
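On the versioned-documentation point above: with Documenter.jl, a minimal docs/make.jl along the following lines builds the site and lets deploydocs publish separate stable and dev versions to gh-pages. The page layout and repo URL shown are illustrative assumptions, not the package's actual docs configuration.

```julia
# docs/make.jl — minimal Documenter.jl sketch (illustrative only)
using Documenter, JMcDM

makedocs(
    sitename = "JMcDM",
    modules  = [JMcDM],
    pages    = ["Home" => "index.md"],
)

# deploydocs pushes builds to the gh-pages branch; tagged releases get
# their own versioned docs (and a "stable" alias) separate from "dev".
deploydocs(repo = "github.com/jbytecode/JMcDM.git")
```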

@jbytecode

Dear reviewer @sylvaticus

Thank you very much for your suggestions. Both the software and the paper are now better thanks to your comments. Here is the summary of my changes:

  • Added mcdm(df, w, fns, method) type dispatch. For example

julia> mcdm(df, w, fns, TopsisMethod())

now works. All of the methods are implemented in the same way, and the methods that have optional parameters work similarly, like

julia> mcdm(df, w, fns, XXXMethod(parameter, parameter))

where XXXMethod inherits the type MCDMMethod. This implementation covers the method-specific arguments; for example, for Promethee it is like

julia> mcdm(setting, PrometheeMethod(prefs, qs, ps))

  • makeminmax() added for this:
    TOPSIS: Error During Test at test.jl:104
    Got exception outside of a @test

MethodError: no method matching topsis(::DataFrame, ::Vector{Float64}, ::Vector{typeof(maximum)})
Closest candidates are:
topsis(::DataFrame, ::Vector{Float64}, ::Vector{Function})

When we use [maximum, maximum, maximum], Julia recognizes the vector as type Vector{typeof(maximum)} rather than Vector{Function}. When the function array includes both, no errors are thrown. I left it as is.

  • v in vikor, zeta in grey, etc. are now optional keyword arguments, implemented in a more standard way.

  • Added print(t <: MCDMResult) functions for all MCDM methods.

  • Definition of Promethee was fixed.

  • MCDMSetting type is defined and method(setting::MCDMSetting) is implemented
    for the methods that take df, w, fns as arguments. This type is documented as well. This also applies to calls of the form

julia> mcdm(setting, TopsisMethod())

  • documentation added for summary(). The summary(setting::MCDMSetting, methods) type of function call is also available for this.

  • link to GitHub repo added in docs. New sections added to online docs.

  • A note has been added to documentation about C and D values of Electre method.

  • Dematel's input values are defined in the documentation.

  • Documentation added for the previously undocumented makeDecisionMatrix(). It is right that it converts a matrix to a DataFrame object; additionally, it adds some nice column headers. This function is not vital, but it is useful.

  • A new documentation section is added for summary, mcdm, makeDecisionMatrix, etc.

  • The article has been updated based on the comments. The Statement of need section now compares the implemented package with the cited ones. Citations are ordered alphabetically. A typo was fixed for multiple citations in Vikor. Citations are typeset in the standard JOSS template.

  • Numerous additional test cases now cover the newly implemented interfaces for MCDMMethod, mcdm, MCDMSetting, etc.

  • A descriptive "community guidelines" document was also added to the repo.

@jbytecode

@whedon generate pdf


whedon commented Jul 4, 2021

👉📄 Download article proof 📄 View article proof on GitHub 📄 👈

@jbytecode

@whedon check references


whedon commented Jul 4, 2021

Reference check summary (note 'MISSING' DOIs are suggestions that need verification):

OK DOIs

- 10.1109/4235.996017 is OK
- 10.1016/j.softx.2019.02.004 is OK
- 10.1109/ieem.2009.5373124 is OK
- 10.1002/9780470400531.eorms0970 is OK
- 10.1016/s2212-5671(14)00342-6 is OK
- 10.1007/978-3-319-06647-9 is OK
- 10.1137/141000671 is OK
- 10.1007/978-3-642-48318-9_3 is OK
- 10.1051/ro/196802v100571 is OK
- 10.1111/1467-8667.00269 is OK
- 10.1016/0022-2496(77)90033-5 is OK
- 10.1016/s0167-6911(82)80025-x is OK
- 10.1016/0167-9236(89)90037-7 is OK
- 10.1287/opre.2.2.172 is OK
- 10.5755/j01.eee.122.6.1810 is OK
- 10.15388/informatica.2015.57 is OK
- 10.1016/j.eswa.2014.11.057 is OK
- 10.1016/j.cie.2019.106231 is OK
- 10.1108/md-05-2017-0458 is OK
- 10.1016/0305-0548(94)00059-h is OK
- 10.1002/j.1538-7305.1948.tb01338.x is OK
- 10.1016/0377-2217(78)90138-8 is OK

MISSING DOIs

- None

INVALID DOIs

- None

@openjournals openjournals deleted a comment from whedon Jul 5, 2021

jbytecode commented Jul 6, 2021

Dear reviewer @sylvaticus

Recent changes were shipped in v0.2.0. Today, I have just released a new version, v0.2.1, with new documentation and several changes/corrections in the repo, the online docs, and the source code. The new version is ready to update in Julia.

Thank you.


whedon commented Jul 13, 2021

👋 @sylvaticus, please update us on how your review is going (this is an automated reminder).


whedon commented Jul 13, 2021

👋 @brunaw, please update us on how your review is going (this is an automated reminder).

@sylvaticus

@whedon generate pdf


whedon commented Jul 13, 2021

👉📄 Download article proof 📄 View article proof on GitHub 📄 👈

@sylvaticus

Hello, thank you for your quick reaction. I tested the changes you made to the readme, the online documentation, the JOSS paper, and the software, and I believe they have improved the software and its user experience.

The two main points (scope of the software and methods integration) that I have highlighted have been addressed respectively in the readme page and in the new "high-level" functions.

There are still a couple of very minor errors present that I may have communicated poorly in my review (English is not my mother tongue):

  • electre function documentation: the output is copy/pasted from the topsis function above, but it has not been adapted. The electre documentation presents the output as a "TopsisResult" object while it is an "ElectreResult" object
  • The first parameter of the AHP function is an array of DataFrames, but in the documentation it is presented as a single DataFrame
  • I appreciated the added documentation in the dematel function, but I still don't get why you use an n × m matrix... it seems to me that the input matrix of the function should be square, so n × n?

There remain some of the "nice to have" things that I have suggested, but these, of course, do not in my opinion preclude the publication of the JOSS paper: the coverage reports, the documentation divided by release, and documentation accompanying each release. I have noticed indeed that you don't have release notes. You can easily add release notes in the same JuliaRegistrator command:

@JuliaRegistrator register

Release notes:

    - bla bla bla
    - bla bla bla

Finally, I didn't understand your reply "Statement of need section is now comparing the implemented package with the cited ones." This was already the case in the first version of the paper. It is my understanding that the intended section for comparison with existing software is instead the "State of the field" section, with the "Statement of need" section more describing the problem to which the software provides solutions (more or less what is now in "State of the field"). However, I don't know if this is strictly enforced at JOSS, or if it is just a recommendation.

@jbytecode

Dear reviewer @sylvaticus,

The problem is not your English; I understand exactly what you mean, but the update was quite huge and I may have missed some things. I will correct/consider the latest issues that you have raised above.

Thank you very much for your valuable contributions.


jbytecode commented Jul 14, 2021

@whedon generate pdf

  • electre function documentation: the output is copy/pasted from the topsis function above, but it has not been adapted. The electre documentation presents the output as a "TopsisResult" object while it is an "ElectreResult" object
  • The first parameter of the AHP function is an array of DataFrames, but in the documentation it is presented as a single DataFrame
  • I appreciated the added documentation in the dematel function, but I still don't get why you use an n × m matrix... it seems to me that the input matrix of the function should be square, so n × n?
  • The manuscript sections are refactored as recommended above.


whedon commented Jul 14, 2021

👉📄 Download article proof 📄 View article proof on GitHub 📄 👈


brunaw commented Jul 16, 2021

Dear @jbytecode,

Here are my review comments about the submission; please let me know if you have any questions about them:

General

  • The summary section seems to have a few disconnected sentences. Could the authors make the text flow better?

  • The State of the field section of the paper needs some changes. The first paragraphs are not a literature review per se, but rather an introduction/explanation of the method. That could probably go in an "Introduction" or "Methods" section, and then the actual LR paragraph could be the "Literature review" section. Does that make sense?

  • The last paragraph of the State of the field needs at least some connections between the sentences. The way it's currently written makes it look like just a list of software, with no comments or details, so it's hard to understand why they are relevant to this submission

  • In the "Statement of need" section, there is very little written
    about the actual need of the package. My suggestion is that the
    authors complement that.

  • In the "Community guidelines", what does:

"Let the community discuss the new contribution"

mean in practice? Could the authors please specify that in the text? I understand the idea, but the way it's written seems a bit confusing to whoever might want to contribute

  • There are no guidelines on error reporting/seeking support, which could be added to the Community guidelines

Package content/documentation

  • I haven't found many usage examples for the functions (besides in the documentation). Do the authors plan to add those, e.g. with a package manual?
  • There is no list of dependencies explicitly written for the users to see
  • It would be nice to say why you are also loading DataFrames in the code example

Package code

  • The package installs locally with no issues, but the precompilation takes a while (at least on my computer). Is that normal?

Bruna


jbytecode commented Jul 17, 2021

@whedon generate pdf

Dear reviewer @brunaw,

Firstly, thank you for your valuable comments. Thanks to your suggestions, the new version of the package and the paper are now better. Please let me know if some of the listed issues are incompletely addressed or if I have misunderstood something. Here is a list of my changes:


General

  • The summary section seems to have a few disconnected sentences. Could the authors make the text flow better?
  • The summary section is refactored.
  • The State of the field section of the paper needs some changes. The first paragraphs are not a literature review per se, but rather an introduction/explanation of the method. That could probably go in an "Introduction" or "Methods" section, and then the actual LR paragraph could be the "Literature review" section. Does that make sense?
  • State of the field is divided into an Introduction and a State of the field section. The former basically describes the problem whereas the latter is focused on the current developments on the subject.
  • The last paragraph of the State of the field needs at least some connections between the sentences. The way it's currently written makes it look like just a list of software, with no comments or details, so it's hard to understand why they are relevant to this submission
  • The section is refactored.
  • In the "Statement of need" section, there is very little written
    about the actual need of the package. My suggestion is that the
    authors complement that.
  • This is now made clearer.
  • In the "Community guidelines", what does:

"Let the community discuss the new contribution"

mean in practice? Could the authors please specify that in the text? I understand the idea, but the way it's written seems a bit confusing to whoever might want to contribute

  • There are no guidelines on error reporting/seeking support, which could be added to the Community guidelines
  • Clarified. Added our Slack channel for discussions and possible conversations.

Package content/documentation

  • I haven't found many usage examples for the functions (besides in the documentation). Do the authors plan to add those, e.g. with a package manual?
  • Added new documentation (and examples) for single-criterion methods such as minimax, maximin, savage, maximum likelihood, etc. A new version of the package has been pushed to the Julia registry. The online documentation has been updated.
  • There is no list of dependencies explicitly written for the users to see
  • Since the Julia package manager automatically downloads and installs the dependencies, a standard user is not interested in this list. However, a list of dependencies has been added to the README.md file.
  • It would be nice to say why you are also loading DataFrames in the code example
  • The package now exports DataFrame, so there is no longer a need to require this separately. The examples in both the paper and the repo have been updated.

Package code

  • The package installs locally with no issues, but the precompilation takes a while (at least on my computer). Is that normal?
  • Since the package depends on some heavyweight packages such as JuMP and Cbc, this is normal. The same issue affects many Julia packages, because Julia precompiles the whole package and its dependencies before first use. There is nothing we can do about this (at least as far as I know).

Bruna

@whedon commented Jul 17, 2021

👉📄 Download article proof 📄 View article proof on GitHub 📄 👈

@whedon commented Sep 28, 2021

👉📄 Download article proof 📄 View article proof on GitHub 📄 👈

@drvinceknight commented Sep 28, 2021

Everything looks good to me. I've tried to find DOIs for the remaining 4 papers myself with no success.

@drvinceknight

Could you make a tagged release and archive, and report the version number and archive DOI here.

Please check that the archive deposit has the correct metadata (title and author list): make sure it matches the paper.

@jbytecode

@drvinceknight

  • The version is v0.2.4
  • The archive is 10.5281/zenodo.5534663 (Zenodo link).

@drvinceknight

@whedon set 10.5281/zenodo.5534663 as archive

@whedon commented Sep 29, 2021

OK. 10.5281/zenodo.5534663 is the archive.

@drvinceknight

@whedon set v0.2.4 as version

@whedon commented Sep 29, 2021

OK. v0.2.4 is the version.

@drvinceknight

@whedon recommend-accept

@whedon added the recommend-accept label Sep 29, 2021
@whedon commented Sep 29, 2021

Attempting dry run of processing paper acceptance...

@whedon commented Sep 29, 2021

Reference check summary (note 'MISSING' DOIs are suggestions that need verification):

OK DOIs

- 10.1109/4235.996017 is OK
- 10.1007/978-1-4939-3094-4 is OK
- 10.1016/j.softx.2019.02.004 is OK
- 10.1109/ieem.2009.5373124 is OK
- 10.1002/9780470400531.eorms0970 is OK
- 10.1016/s2212-5671(14)00342-6 is OK
- 10.1007/978-3-319-06647-9 is OK
- 10.1137/141000671 is OK
- 10.1007/978-3-642-48318-9_3 is OK
- 10.1051/ro/196802v100571 is OK
- 10.1287/mnsc.31.6.647 is OK
- 10.1111/1467-8667.00269 is OK
- 10.1016/0022-2496(77)90033-5 is OK
- 10.1016/s0167-6911(82)80025-x is OK
- 10.1016/0167-9236(89)90037-7 is OK
- 10.1287/opre.2.2.172 is OK
- 10.3846/tede.2010.10 is OK
- 10.5755/j01.eee.122.6.1810 is OK
- 10.15388/informatica.2015.57 is OK
- 10.1016/j.eswa.2014.11.057 is OK
- 10.1016/j.cie.2019.106231 is OK
- 10.13140/2.1.2707.6807 is OK
- 10.1108/md-05-2017-0458 is OK
- 10.1016/0305-0548(94)00059-h is OK
- 10.1002/j.1538-7305.1948.tb01338.x is OK
- 10.1016/0377-2217(78)90138-8 is OK

MISSING DOIs

- None

INVALID DOIs

- None

@whedon commented Sep 29, 2021

👋 @openjournals/joss-eics, this paper is ready to be accepted and published.

Check final proof 👉 openjournals/joss-papers#2627

If the paper PDF and Crossref deposit XML look good in openjournals/joss-papers#2627, then you can now move forward with accepting the submission by compiling again with the flag deposit=true e.g.

@whedon accept deposit=true

@danielskatz

👋 @jbytecode - I've suggested some small changes in the paper in jbytecode/JMcDM#30 - please merge this, or let me know what you disagree with, then we can proceed to final acceptance and publishing.

@jbytecode

@danielskatz thank you for your suggestions. Indeed they are very helpful.
Merged.

@danielskatz

@whedon recommend-accept

@whedon commented Sep 29, 2021

Attempting dry run of processing paper acceptance...

@whedon commented Sep 29, 2021

Reference check summary (identical to the check above): all DOIs OK; no MISSING or INVALID DOIs.

@whedon commented Sep 29, 2021

👋 @openjournals/joss-eics, this paper is ready to be accepted and published.

Check final proof 👉 openjournals/joss-papers#2630

If the paper PDF and Crossref deposit XML look good in openjournals/joss-papers#2630, then you can now move forward with accepting the submission by compiling again with the flag deposit=true e.g.

@whedon accept deposit=true

@danielskatz

@whedon accept deposit=true

@whedon commented Sep 29, 2021

Doing it live! Attempting automated processing of paper acceptance...

@whedon added the accepted and published labels Sep 29, 2021
@whedon commented Sep 29, 2021

🐦🐦🐦 👉 Tweet for this paper 👈 🐦🐦🐦

@whedon commented Sep 29, 2021

🚨🚨🚨 THIS IS NOT A DRILL, YOU HAVE JUST ACCEPTED A PAPER INTO JOSS! 🚨🚨🚨

Here's what you must now do:

  1. Check final PDF and Crossref metadata that was deposited 👉 Creating pull request for 10.21105.joss.03430 joss-papers#2631
  2. Wait a couple of minutes, then verify that the paper DOI resolves https://doi.org/10.21105/joss.03430
  3. If everything looks good, then close this review issue.
  4. Party like you just published a paper! 🎉🌈🦄💃👻🤘

Any issues? Notify your editorial technical team...

@danielskatz

Congratulations to @jbytecode (Mehmet Hakan Satman) and co-authors!!

And thanks to @brunaw and @sylvaticus for reviewing, and @drvinceknight for editing!
We couldn't do this without you!

@whedon commented Sep 29, 2021

🎉🎉🎉 Congratulations on your paper acceptance! 🎉🎉🎉

If you would like to include a link to your paper from your README use the following code snippets:

Markdown:
[![DOI](https://joss.theoj.org/papers/10.21105/joss.03430/status.svg)](https://doi.org/10.21105/joss.03430)

HTML:
<a style="border-width:0" href="https://doi.org/10.21105/joss.03430">
  <img src="https://joss.theoj.org/papers/10.21105/joss.03430/status.svg" alt="DOI badge" >
</a>

reStructuredText:
.. image:: https://joss.theoj.org/papers/10.21105/joss.03430/status.svg
   :target: https://doi.org/10.21105/joss.03430

This is how it will look in your documentation:

[DOI badge image]

We need your help!

Journal of Open Source Software is a community-run journal and relies upon volunteer effort. If you'd like to support us, please consider doing either one (or both) of the following:
