
Integrate GitHub Actions CI/CD tests #1840

Merged

Conversation

@webknjaz (Member) commented Sep 1, 2019

Demo: https://github.com/webknjaz/setuptools/runs/209318165 (in fork because it won't show up in upstream w/o merge)

@techtonik (Contributor) commented

Does it work offline? Or is it solely for vendor lock-in?

@pganssle (Member) commented Sep 3, 2019

@techtonik That seems a bit of an unfair standard to judge this contribution by, considering none of our other CI providers have configuration files that work offline or are open source.

Also, I may be misunderstanding your concern - are you suggesting that @webknjaz is adding this to try and lock us in to a specific vendor, or that he's adding it to help us avoid vendor lock-in?

@techtonik (Contributor) commented

It is possible to run Travis CI tests this way - https://stackoverflow.com/questions/21053657/how-to-run-travis-ci-locally - and I guess GitLab CI can also be self-hosted, so I just wonder if there is some advancement in defining CI tasks with this new technology.

I'd prefer open source projects promote open source development practices that don't depend on proprietary platforms, but that's just my opinion.

@webknjaz (Member, Author) commented Sep 3, 2019

> it is solely for vendor lock-in

GitHub has promised to release an open-source runner (I guess that'll happen after Nov 13) for their workflows which you'd be able to plug into your repos BYON-style.
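For illustration only (this is not part of the PR): once such a runner is available, pointing a job at it should mostly be a matter of the runs-on label. The workflow name, job name, and steps below are made up; "self-hosted" is the label GitHub Actions uses to route jobs to runners you register against the repository yourself.

```yaml
# Hypothetical sketch: run the test suite on a runner you host yourself
# instead of a GitHub-hosted VM.
name: Tests on own hardware
on: [push]
jobs:
  tests:
    runs-on: self-hosted  # routes the job to a registered self-hosted runner
    steps:
    - uses: actions/checkout@v1
    - name: Run the test suite
      run: |
        python -m pip install --user tox
        python -m tox
```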

> It is possible to run Travis CI tests this way

Those images are out of date by a few years. And Travis CI doesn't use Docker anymore. The only way to truly debug in their environment today is to email them and request the SSH feature.

This contribution is meant to (1) demo the new thing and (2) probably spread the load between CIs; plus, the result statuses are shown more precisely in PRs. Right now I'm thinking it should be quite good for fast linters in particular.
Also, it's useful to test against multiple platforms to hit more corner cases. I've tried this on a number of projects and in most of them I've found failing tests (hence the Windows factor in the matrix is commented out). So it's useful to cover more cases.
I'm not advocating completely replacing the existing stuff, just showing additional resources.
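For readers unfamiliar with the format, here is a minimal sketch of the kind of matrix workflow this PR adds. The OS list, Python versions, and step names are illustrative only; the real file added by the PR is .github/workflows/python-tests.yml in the diff.

```yaml
name: Python tests
on: [push, pull_request]
jobs:
  tests:
    runs-on: ${{ matrix.os }}
    strategy:
      matrix:
        # windows-latest left out here, mirroring the commented-out factor mentioned above
        os: [ubuntu-latest, macos-latest]
        python-version: [2.7, 3.6, 3.7]
    steps:
    - uses: actions/checkout@v1
    - name: Set up Python ${{ matrix.python-version }}
      uses: actions/setup-python@v1
      with:
        python-version: ${{ matrix.python-version }}
    - name: Run tests
      run: |
        python -m pip install tox
        python -m tox
```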

It's also useful for doing GitHub-related automations (you can react to any event, post comments back, etc.).
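As a hedged example of that kind of automation (not something this PR adds), a workflow can subscribe to a repository event and call the GitHub API with the token Actions injects. The workflow name, comment text, and trigger choice below are made up.

```yaml
# Illustrative only: react to newly opened issues and post a comment back.
name: Greet new issues
on:
  issues:
    types: [opened]
jobs:
  greet:
    runs-on: ubuntu-latest
    steps:
    - name: Post a comment via the GitHub API
      run: |
        curl -sS -X POST \
          -H "Authorization: token ${{ secrets.GITHUB_TOKEN }}" \
          -H "Content-Type: application/json" \
          -d '{"body": "Thanks for the report! A maintainer will take a look."}' \
          "${{ github.event.issue.comments_url }}"
```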

@techtonik (Contributor) commented

FWIW, Travis CI images are updated regularly; check this out: https://hub.docker.com/u/travisci

@webknjaz (Member, Author) commented Sep 4, 2019

Maybe something's changed, but those aren't the VMs they're actually running.

@webknjaz (Member, Author) commented Sep 4, 2019

@pganssle you may have to hit "Enable" in the pop-up @ https://github.com/pypa/setuptools/actions

@techtonik (Contributor) commented

> Maybe something's changed, but those aren't the VMs they're actually running.

Right, containers are not VMs.

@webknjaz (Member, Author) commented

@pganssle ready to give it a try?

@pganssle (Member) commented

I don't really have time to look into this right now, maybe one of the other maintainers can take a look? @pypa/setuptools-developers

I am not 100% clear on whether this adds more CI runs to each PR or if it's running on a schedule, like a cron job or something. I remember the last time I tried to hook up Travis or something as a GitHub app instead of its built-in version, I didn't really care for the UI.

Do you have any examples of repos that already set up something like this so we can see how it will change the PR workflow?

@jaraco (Member) commented Sep 16, 2019

I'm pleased that the recipe isn't too complicated and reads fairly intuitively. If @webknjaz can assert that this change doesn't affect the PR workflow, or at least doesn't block the current PR workflow, then I'm happy to proceed. If it does affect the PR workflow, I'd like to ensure that it doesn't dramatically impact either the reliability (flakiness) of the tests or the total run time. I'd be okay accepting it if @webknjaz could provide a snapshot analysis of the total run time of a PR check run before and after this change.

@webknjaz (Member, Author) commented

@pganssle

> I am not 100% clear on whether this adds more CI runs to each PR or if it's running on a schedule

All 3: PR, push and cron.
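In workflow terms that maps onto the on: block, roughly like this (the cron expression here is just an example, not necessarily the one in the PR):

```yaml
on:
  push:
  pull_request:
  schedule:
  - cron: "0 6 * * *"  # example: run once a day at 06:00 UTC
```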

> Do you have any examples of repos that already set up something like this so we can see how it will change the PR workflow?

Scroll to the Checks section on this page: aio-libs/aiohttp#4056.
As for the workflow: no change. You may opt in to requiring checks in the branch protection settings, but it doesn't block anything by default. It's just another status report in the UI for you.

@jaraco unless you enforce this, it won't block PRs. Also, GitHub runs up to 20 jobs in parallel, and in this demo in my fork you can see that each job takes 3-4 minutes to complete: https://github.com/webknjaz/setuptools/runs/209318165. So it's reasonable to expect that all jobs in a workflow will finish within about 4 minutes of pushing the code.

.github/workflows/python-tests.yml (excerpt under review):

```yaml
  tests:
    name: 👷
    runs-on: ${{ matrix.os }}
    strategy:
```
@webknjaz (Member, Author) commented on this excerpt:

It's important to note that by default any job failure will cause all other jobs in the workflow to be canceled. If you don't want that, apply this patch:

Suggested change:

```diff
     strategy:
+      fail-fast: false
```
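For orientation, that option sits directly under strategy, next to the matrix. A minimal sketch (the matrix values are illustrative, not the exact ones in this PR):

```yaml
    strategy:
      fail-fast: false  # one failing job no longer cancels the rest of the matrix
      matrix:
        os: [ubuntu-latest, macos-latest]
```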

A contributor replied:

I've found this useful, especially if there's a network-related timeout in one job.

@webknjaz (Member, Author) commented

@jaraco is this enough?

@jaraco (Member) commented Oct 2, 2019

Yes, let's give it a go. We can dial it back or roll it back entirely if it causes too much friction.

Review threads on .github/workflows/python-tests.yml: resolved.
@webknjaz (Member, Author) commented

@jaraco mind merging then?


@jaraco (Member) commented Jan 13, 2020

I'm not sure why I didn't merge this before. Here it goes.

@jaraco merged commit 71aa332 into pypa:master on Jan 13, 2020.