Add compatibility suite with selected OpenAPI documents #267
Conversation
Signed-off-by: Si Beaumont <[email protected]>
This is great, @simonjbeaumont! Can you also do one run with building the generated code on this CI, to compare it to the 7 mins of just generating?
This reverts commit 3263b22.
This reverts commit b244f6f.
This is probably fine and makes things much easier to debug on failure. @czechboy0 shall we proceed by getting a PR pipeline that does codegen only for all the docs (in sequence, not parallel)?
Yes, let's start with that, and we'll work on the next steps in subsequent PRs.
@yim-lee please could you configure a new pipeline? This PR should be a good candidate to see if it's running. It should take ~10 mins.
@swift-server-bot test this please
@simonjbeaumont Added.
This reverts commit c969660.
Thanks @yim-lee! @czechboy0, I've removed the docc-test hijack and this is ready for review. |
👏
Motivation
In order to help shape the 1.0 release, we'd like to maintain a public compatibility test suite, with some selected real-world OpenAPI documents. The goal of this is to check that (a) we can run the code generation for the document, and (b) the generated code compiles. We can then use this test as part of CI. However, we may want to gate PRs only on the code generation phase and run the build on main to keep PR feedback timely.
Modifications
Added a CompatibilityTest to the OpenAPIGeneratorReferenceTests test target.

Result
CI runs can check we are still able to handle selected real-world OpenAPI documents.
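For local verification, a run of just this suite might look like the following CI fragment — a sketch that assumes the test class is named CompatibilityTest as above (swift toolchain required):

```shell
# Run only the compatibility suite from the OpenAPIGeneratorReferenceTests
# target; --filter matches tests by name.
swift test --filter CompatibilityTest
```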
Test Plan
I have run this many times locally with varying parameters, and intend to tweak the parameters used in CI during this job. The draft PR hijacks an existing CI pipeline (docc-test), which will be reverted and replaced with a new pipeline before merging.

TODO
Notes
There are a number of decisions to make about how we run this suite. The sections below contain some numbers from local experiments.
Building the generator and compatibility test
Compared building with an explicit -j $(($(nproc)-1)) against the default.

Conclusion: -j has no impact; presumably because it defaults to core count.

All following experiments assume no -j argument is used when building the generator and the compatibility test suite.
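For reference, the explicit job count used in the experiment above is derived from the core count; a minimal sketch of the arithmetic (nproc from GNU coreutils):

```shell
# One build job fewer than the number of cores, as used in the experiment.
JOBS=$(($(nproc)-1))
echo "would build with: swift build -j $JOBS"
```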
LLVM flag when building the generator and compatibility test
-Xllvm -vectorize-slp=false
Conclusion: As expected, this has no impact in debug; but it also has none in release.
All following experiments do not use this flag when building the generator and
the compatibility test suite.
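For reference, a sketch of the invocation with the LLVM flag passed through to the compiler backend (swift toolchain assumed; this is a config fragment, not something the PR keeps):

```shell
# Build with SLP vectorization disabled in LLVM; per the experiment above,
# this made no measurable difference in either debug or release.
swift build -c release -Xllvm -vectorize-slp=false
```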
Test times (only code generation)
The following table shows the result of running the compatibility test suite in
a mode that only performs code generation, i.e. it does not build the generated
code.
Conclusion: Probably worth parallelizing the test, up to the core count.
Building the compatibility test and running codegen
The following table combines the results from the previous sections.
Conclusion: Given that building the generated code (next section) is unaffected by whether the generator and test are compiled in release mode, we can determine that the fastest pipeline that runs the codegen is a debug build with parallel tests.
All further experiments use debug build of the generator and tests.
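Putting these conclusions together, a codegen-only PR gate might be sketched as the following CI fragment (hypothetical; the worker count mirrors footnote 1, and a swift toolchain is assumed):

```shell
# Build the generator and tests in debug (the default configuration)...
swift build --build-tests
# ...then run the suite in parallel without rebuilding.
swift test --skip-build --parallel --num-workers $(($(nproc)-1))
```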
Test times (including building generated code)
Tests were run with --parallel.

Conclusion: Using --parallel at the XCTest level doesn't play too well with the swift build within the test. That is because each of the swift build commands in each of the parallel tests will parallelise according to the number of cores. Looking at top in the container, we can see that there is an explosion of build processes that compete with each other and with the outstanding swift-openapi-generator processes that are still in flight.
Disable parallel build of generated code
This can be resolved by using swift build -j 1 within the test, which does allow --parallel to provide some speedup.

Conclusions
A pipeline that only runs code generation takes around 3 minutes, and should:

a. Build the generator and compatibility test suite in debug.
b. Use --parallel when running the test.

A pipeline that also builds the generated code takes around 17 minutes, and should, additionally:

c. Use swift build -j 1 when building the generated code.

Potential other directions
It might be worth not using XCTest as the harness for the compatibility suite, so that we can control the parallelism from within the compatibility test harness itself.
That said, this is probably good enough for now.
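As a sketch of that direction: a standalone harness can bound the parallelism itself with a fixed worker pool. The snippet below uses xargs -P with placeholder document names, and echo standing in for the real codegen step:

```shell
# Run one (placeholder) codegen job per document, at most 2 at a time.
printf '%s\n' doc-a.yaml doc-b.yaml doc-c.yaml \
  | xargs -P 2 -I{} sh -c 'echo "generating {}"'
```

With a real generator invocation in place of echo, the -P value caps the number of concurrent builds regardless of core count.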
Footnotes
1. Parallel test implies the test was run with --parallel --num-workers $(($(nproc)-1)). ↩
2. This does not include build time; tests are run with --skip-build. ↩
3. Note, this does not include the build time of the generator or the compatibility test suite. ↩