
[docs] Initial FeatureSet generator documentation #5188

Merged Feb 2, 2020 (3 commits)

Conversation

jimschubert
Member

This adds the initial step toward a searchable feature matrix. It emits plain-text and Markdown output when the user passes the --feature-set or --full-details option to the CLI's config-help command.

Example (CLI):

[screenshot of CLI output]

Example (Web/Markdown):

[screenshot of rendered Markdown]
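The Markdown output is rendered from the generator's declared FeatureSet. As a rough illustration of the idea only (the field names, table layout, and function below are hypothetical; the actual implementation lives in openapi-generator's Java codebase), a sketch in Python:

```python
# Hypothetical sketch: render one feature category of a FeatureSet as a
# Markdown table. Names and layout are illustrative, not the real output.

def feature_matrix_markdown(category: str, features: dict[str, bool]) -> str:
    """Render a category heading plus a Name/Supported Markdown table."""
    lines = [
        f"### {category}",
        "",
        "| Name | Supported |",
        "| ---- | --------- |",
    ]
    for name, supported in sorted(features.items()):
        mark = "x" if supported else ""  # "x" marks a supported feature
        lines.append(f"| {name} | {mark} |")
    return "\n".join(lines)

print(feature_matrix_markdown(
    "Parameter Features",
    {"Path": True, "Query": True, "Cookie": False},
))
```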

cc @OpenAPITools/generator-core-team and @spacether (by earlier request)

PR checklist

  • Read the contribution guidelines.
  • If contributing template-only or documentation-only changes which will change sample output, build the project beforehand.
  • Run the shell script(s) under ./bin/ (or the Windows batch scripts under .\bin\windows) to update the Petstore samples related to your fix. This is important, as CI jobs will verify all generator outputs of your HEAD commit, and these must match the expectations made by your contribution. You only need to run ./bin/{LANG}-petstore.sh and ./bin/openapi3/{LANG}-petstore.sh if updating the code or mustache templates for a language ({LANG}) (e.g. php, ruby, python, etc.).
  • File the PR against the correct branch: master, 4.3.x, 5.0.x. Default: master.
  • Copy the technical committee to review the pull request if your PR is targeting a particular programming language.

@jimschubert jimschubert added this to the 4.3.0 milestone Feb 2, 2020
@spacether
Contributor

spacether commented Feb 2, 2020

Could we add a field to Feature that lets us point to a sample test or generated function? Or maybe a map containing:

  • spec link
  • generated code link
  • test link

@jimschubert
Member Author

@spacether I think the complexity of doing something like that would outweigh the added value. It would require that our codebase (the Feature metadata) become aware of test structures and GitHub repo structures, that tests follow a strict convention, and/or that these docs could no longer be maintained at the implementation site and auto-generated as they are now. I think we can also assume that users have access to the OAI Specification documents, which are a single page each (and 3.x has multiple revisions), since a user would have an OpenAPI document before finding our tooling to consume that document.

The way our documentation works is that we generate markdown, and our docsite reads this markdown when it is compiled and generates a static HTML site from that. This means that any links in these markdown files would have to be absolute links. Consider how that would affect documentation between releases. Currently, users can go to the 4.2.0 tag and browse documentation cleanly and consistently from there (see example). If these docs were to include hard-coded links out to GitHub, it would mean we would either need to pin the link to a version or SHA1 hash (not easy to do) or that we would have to link to master (which would break all documentation from before the previous release).

Aside from that, the samples we generate for any generator and/or library are only one permutation of countless possible outputs… I think linking directly to something that suggests "this is what your code will look like" may be overly confusing.

We do have plans to write tests which will allow generator maintainers to create evidence-based tests according to features, and I think that will be discoverable enough on GitHub for those interested in the implementation details of the feature set. However, these will be bound to a template model and not the template itself. To auto-generate those from templates would mean standardizing template structures across all generators, and that may make some generators less maintainable (see some examples: MySQL, Bash, HTML, Finch, Scalaz, protobuf, openapi-yaml). It may also add complexity to other generators which are organized in such a way as to reduce complexity or enable reuse (see some examples: csharp-netcore, aspnet, kotlin, java and its libraries). Restructuring templates across all generators is not only a significant effort, but would also invariably break existing users who have heavily customized templates.

@jimschubert
Member Author

partially related to #503

@jimschubert
Member Author

adds the feature option to config-help as requested in #1811

@jimschubert jimschubert merged commit a496c20 into OpenAPITools:master Feb 2, 2020
@jimschubert jimschubert deleted the feature-matrix branch February 2, 2020 01:51
@spacether
Contributor

spacether commented Feb 2, 2020

@jimschubert your points make a lot of sense.
To me, though, the template testing plan that you lay out sounds more complicated than linking to examples at a specific commit.

How would developers write those evidence-based tests?
In a new spec? As standalone Java generator tests?
Non-Java generators need to output many/all files for the generated code to be usable (as with Python packages), so the ability to write tests usually requires that a spec produce a sample client.
For example, in Python, endpoint tests need a configuration object, an input data model, and an API client when calling a POST endpoint. A Java test of FromOperation is not so useful to Python client users.

One way that we could make hard links to commits work is:

  • require that a spec example, generated sample code, and a test were submitted in an earlier commit
  • require that example links in Features include a SHA-1 hash; this could be enforced with a git hook
  • a user adds that hard link in a PR updating a Feature with examples
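The SHA-pinning rule in the second bullet could be checked mechanically in a git hook. A minimal sketch, assuming a GitHub blob-link format and a full 40-character SHA (the regex, helper name, and dummy SHA below are illustrative, not an agreed convention):

```python
import re

# Hypothetical validator for the proposed git hook: accept GitHub blob
# links only when pinned to a full 40-character commit SHA, rejecting
# branch names like "master" whose content drifts between releases.
PINNED_LINK = re.compile(
    r"^https://github\.com/[^/]+/[^/]+/blob/[0-9a-f]{40}/"
)

def is_pinned(link: str) -> bool:
    """Return True when the link is pinned to a commit SHA."""
    return bool(PINNED_LINK.match(link))

# Dummy SHA used purely for illustration.
print(is_pinned(
    "https://github.com/OpenAPITools/openapi-generator/blob/"
    "0123456789abcdef0123456789abcdef01234567/README.md"
))  # True: pinned to a commit SHA
print(is_pinned(
    "https://github.com/OpenAPITools/openapi-generator/blob/master/README.md"
))  # False: points at a branch
```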

You also mentioned that you think users will find this confusing. I feel differently: if a feature is listed and some examples are given, that helps me understand how to use the feature. My intention is that examples be an optional addition, not a required one, so if generator maintainers choose not to add examples, that's fine too.

Surfacing that a feature is supported, without the option of showing it being used or working, opens the door to many new issue tickets asking "where is it? prove it; give me an example." We already have 1,100+ issues.
