node_modules for real? #12
@chenyong I agree with you 100%. However, here is a workaround if `node_modules` is in your `.gitignore` and you wish to generate `node_modules` at runtime:

```yaml
on: [push]

jobs:
  hello_world_job:
    runs-on: ubuntu-latest
    name: A job to say hello
    steps:
      # To use this repository's private action,
      # you must check out the repository
      - name: Checkout
        uses: actions/checkout@v2
      - name: Install Packages
        run: npm install
      - name: Hello world action step
        uses: ./ # Uses an action in the root directory
        id: hello
        with:
          who-to-greet: "Mona the Octocat!!!!!"
      # Use the output from the `hello` step
      - name: Get the output time
        run: echo "The time was ${{ steps.hello.outputs.time }}"
```

The above-mentioned code is a remix of the code given here.
bump, any way we can just have the actions runner
It's probably bad practice for the direct content of a repository to be the action itself in such a case. One (not so elegant) solution could be to have one repo for the code you work in and another repo where you deploy the build result; that second repo then becomes the action you use.
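As a rough illustration of that two-repo approach, a workflow in the source repo could build the action and push the output to a separate repo that serves as the published action. This is only a sketch under assumptions: the repo names (`my-org/my-action-src`, `my-org/my-action`), the `DEPLOY_TOKEN` secret, and an `npm run build` script that emits `dist/` are all hypothetical.

```yaml
# Sketch only: publish build output from a source repo to a separate "action" repo.
# Repo names, secret name, and build script are assumptions, not anything this repo defines.
name: Publish action
on:
  push:
    branches: [main]
jobs:
  publish:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - run: npm ci && npm run build   # emits dist/ with dependencies bundled
      - name: Push build output to the action repo
        run: |
          git clone "https://x-access-token:${DEPLOY_TOKEN}@github.com/my-org/my-action.git" action-repo
          cp -r dist action.yml action-repo/
          cd action-repo
          git add .
          git -c user.name=ci -c user.email=ci@example.com commit -m "Update built action" || echo "No changes to commit"
          git push
        env:
          DEPLOY_TOKEN: ${{ secrets.DEPLOY_TOKEN }}
```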
@rachmari, @andymckay, @jorgebg, @thboop - do you have any thoughts about this issue?
Hi @adamhenson 👋. This repo is a companion to the documentation in this guide: https://docs.github.com/en/actions/creating-actions/creating-a-javascript-action#commit-tag-and-push-your-action-to-github. While checking in the `node_modules` directory is counterintuitive, the guide linked above 👆 documents why we require you to check it in when creating a JavaScript action.
I've reached out to see if there are any plans to change that in the future, and I'll drop a note here if I hear anything.
Thanks @rachmari. If there is a better place to open this issue, please let me know where. I appreciate the link to the docs, but it's still unclear why @irl-segfault's suggestion isn't possible. This issue matters for open source projects because PRs are quite difficult to manage with the diff from `node_modules`, or even with the Vercel option, which generates a gigantic diff (and threw errors for me).
@adamhenson finding an existing discussion or opening a new one here is the best place: https://github.com/github/feedback/discussions/categories/actions-and-packages-feedback
The actions team doesn't currently have any plans to update the runners to perform the `npm install` for each JavaScript action defined in a workflow. I suspect that could also increase workflow time significantly in some cases, depending on how many actions you use in a single workflow. Rather than checking in the entire `node_modules` directory, another option the guide refers to is using https://github.com/vercel/ncc to compile all of the node modules into a single file. I'm not sure if that's what you were referring to when you said "or even the Vercel option which generates a gigantic diff". GitHub uses
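For context, the ncc approach mentioned above usually looks something like the sketch below. This is not this repository's actual configuration; the file names and the `node16` runtime are assumptions.

```yaml
# Sketch of an action.yml using an ncc bundle instead of a committed node_modules/.
# Build the bundle locally (or in CI) with something like:
#   npm install -g @vercel/ncc
#   ncc build index.js -o dist
# then commit only dist/index.js and point the action at it:
name: 'hello-world-javascript-action'
description: 'Greet someone and record the time'
inputs:
  who-to-greet:
    description: 'Who to greet'
    required: true
    default: 'World'
outputs:
  time:
    description: 'The time we greeted you'
runs:
  using: 'node16'
  main: 'dist/index.js'   # single bundled file produced by ncc
```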
Thanks @rachmari. Yes, I tried ncc, but it threw an error in a certain project, so it seems unstable. Anyway, thanks for your responses. It's not the end of the world, just inconvenient and not ideal, especially for an open source project.
Perhaps GitHub Actions could support downloading a zip archive of `node_modules` from GitHub releases for specific actions.
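Something close to that is already possible today for actions referenced with `uses: ./` (like the earlier workaround) by restoring a prebuilt archive before the action step runs. A rough sketch under assumptions: the release tag (`v1.0.0`), the asset name (`node_modules.zip`), and the `OWNER/some-action` repo are all hypothetical.

```yaml
# Sketch only: restore a local action's dependencies from a release asset instead of
# committing node_modules/. Tag, asset name, and repo are hypothetical.
- name: Check out the action
  uses: actions/checkout@v2
- name: Restore node_modules from a release asset
  run: |
    gh release download v1.0.0 --repo OWNER/some-action --pattern 'node_modules.zip'
    unzip -q node_modules.zip
  env:
    GH_TOKEN: ${{ github.token }}
- name: Run the local action
  uses: ./
```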
I wish that GitHub's native Dependabot could update the `node_modules` folder for us every time it updates our `package.json` and `package-lock.json`.
They tell you
and step 4:
when I remove
## Problem

A good release process needs to be fast while also ensuring quality and consistency. Confidence in quality comes from incorporating automated testing. And there are many aspects of consistency, including:

- Ensuring commits are tagged with a version number so we can know what is in each release (now and in the future, in case of any security reports)
- The version number in `pinecone/__version__` is correctly updated, so we don't try to reuse the same one later
- Building in a standardized way to avoid introducing accidental inconsistencies (e.g. uncommitted and untested local changes accidentally included in a release, using an old version of Python on a local machine, etc.)

## Solution

Implement GitHub Actions workflows for publishing releases (`release.yaml`) and prereleases (`alpha-release.yaml`).

<img width="649" alt="Screenshot 2023-06-06 at 3 01 23 PM" src="https://github.com/pinecone-io/pinecone-python-client/assets/1326365/79e2464a-832c-447d-a771-146dc6d60ecf">

These workflows both follow the same high-level steps that take place in a `publish-to-pypi.yaml` [reusable workflow](https://docs.github.com/en/actions/using-workflows/reusing-workflows):

- Run tests using another new reusable workflow, `testing.yaml`, which I refactored out of the existing PR workflow.
- Automatic version bumps based on the type of release (`major`, `minor`, `patch`).
  - I ended up writing my own small action to do this because I didn't find a nice GitHub action that would do it off the shelf. There are lots of small code packages out there that handle the version bump logic, but they tended to have weird flaws that made them a poor fit (e.g. wanting to handle the git commit part even though this isn't the right moment for it in my workflow, wanting to store separate version-related config elsewhere, poor support for custom alpha/rc version numbers, etc.)
- Check that version numbers are unique (i.e. that there is no preexisting git tag with that number; this mainly protects us from accidentally reusing the same rc number). This validation is possible since git tags become the source of truth for which version numbers are used/available.
- Build with the existing make task.
- Upload to PyPI using twine. This is no change, just using Makefile tasks defined by others in the past.
- Git push the commit with the updated `pinecone/__version__` and git tags. This step is saved for last so that the git tag is not consumed unless the PyPI upload is successful.

## Other notes

Updated the version saved in `pinecone/__version__` because I noticed it was out of date with the latest version published on PyPI. This is one example of a type of error that will no longer occur when publishing from CI.

While implementing this I ran into several different issues in GitHub Actions, including:

- actions/runner#2472
- actions/runner#1483
- actions/hello-world-javascript-action#12

## Future Work

Some ideas for the future:

- I would like to make some automated changelog updates and draft release notes from CI, but I'm not doing this now as this is already a large change.
- I could also do some work to figure out the automatic next rc number by inspecting which git tags have been used. But for now it's easy to just pass in the values as an input when starting the job.

## Type of Change

- [x] Infrastructure change (CI configs, etc)

## Test Plan

I used the test environment at test.pypi.org to publish an rc version of this package using the `alpha-release.yaml` version of the workflow.
Here's a [test run](https://github.com/pinecone-io/pinecone-python-client/actions/runs/5192266625) I did that targeted testpypi and resulted in [this 2.2.2rc4 artifact](https://test.pypi.org/project/pinecone-client/2.2.2rc4/), which can be installed like this by specifying the test.pypi.org index:

```
pip install -i https://test.pypi.org/simple/ pinecone-client==2.2.2rc4
```

In general, to test a GitHub workflow that only exists in a branch, you need to install the GitHub CLI and run a command like this from inside the project:

```
gh workflow run alpha-release.yaml -f ref=jhamon/release-from-ci -f releaseLevel=patch -f prereleaseSuffix='rc4'
```

The only thing I changed after this run was to switch from targeting the test PyPI index to the production PyPI index. The prod workflow is substantially similar, so I have high confidence it will work, but I don't want to test it and release until I land this in master, because I want the release commit to tag a SHA in the main history (and not this branch). I plan to land this, then make a follow-up for any small issues that come up when trying to use it for the first time.
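As an aside, the "no preexisting git tag" safeguard described in the summary above could look roughly like the step below. This is a hedged sketch, not the actual pinecone-python-client implementation; the step name and the `newVersion` input are assumptions.

```yaml
# Sketch only: fail the release if the candidate version tag already exists.
- name: Verify the version tag is unused
  run: |
    git fetch --tags
    if git rev-parse -q --verify "refs/tags/v${NEW_VERSION}" >/dev/null; then
      echo "Tag v${NEW_VERSION} already exists; refusing to reuse it." >&2
      exit 1
    fi
  env:
    NEW_VERSION: ${{ inputs.newVersion }}
```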
One of the more elegant solutions I've come across: you can use a Docker action to install packages via the Dockerfile before running the action, as the OP was requesting. What's cooler is that, from my testing, JavaScript run inside of a Docker action container can still use the
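For anyone curious what that looks like, a minimal Docker container action is sketched below, assuming a Dockerfile (not shown) that copies `package*.json` and the source, runs `npm ci` during the image build, and ends with `CMD ["node", "/index.js"]`. The names and defaults mirror the hello-world example but are not this repo's actual files.

```yaml
# Sketch of an action.yml for a Docker container action; dependencies are
# installed when the image is built, so node_modules/ never needs to be committed.
name: 'hello-world-docker-action'
description: 'Greet someone, installing npm dependencies at image build time'
inputs:
  who-to-greet:
    description: 'Who to greet'
    required: true
    default: 'World'
runs:
  using: 'docker'
  image: 'Dockerfile'
```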
Yeah, using Docker is also not acceptable for me, as I need my actions to complete as fast as possible.
Hi all, I submitted a PR to this repo to switch how this is currently handled. Instead of committing
I see `node_modules/` included in the project, while it's considered bad practice to have it in a git repo. Is it the only way to make sure Node runs well in actions with correct dependencies? Can I install the packages while (or before) the action is actually running?