https://git-scm.com/book/en/v2/Git-Basics-Viewing-the-Commit-History
https://medium.com/@patrickporto/4-branching-workflows-for-git-30d0aaee7bf
https://stackoverflow.com/questions/38831301/how-to-un-fork-the-github-repository
long list of git notes
- this needs to be updated
- actions to v3
- probably redo this entire file
- categorize all below
- anything under this line i wouldnt trust
- find which file has the git flow strategies and put in here
- review the git town plugin, lol forgot this even existed
sparse clone an existing repo from git to local
git clone --filter=blob:none --no-checkout git/url/to/clone
cd into the cloned dir (above), then setup sparse checkout to only pull certain dirs
git sparse-checkout init --cone
git sparse-checkout set paths/to/download
check paths included in sparse-checkout
git sparse-checkout list
only checkout files in the root dir
git clone --filter=blob:none --sparse https://github.com/derrickstolee/sparse-checkout-example
force checking out paths ignoring sparse checkout
e.g. to force checking out a path not matching sparse settings
git checkout --ignore-skip-worktree-bits -- PATHS
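to undo sparse checkout and go back to a full working tree (not in the notes above, but a standard subcommand)
git sparse-checkout disable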
<https://stackoverflow.com/questions/6127328/how-can-i-delete-all-git-branches-which-have-been-merged>
git branch --merged | egrep -v "(^\*|dev)" | xargs git branch -d
git remote prune origin
- AAA git cheatsheet
- atlassian git tuts
- managing merge conflicts
- workflow: comparisons
- workflow: feature branch
- semantic release docs
- conventional commits
- AAA git book
- AAA git first time setup
- AAA git intro
- config
- environment vars
- version control intro
- git workflow comparison
- installing git
- issues with git flow
- actions: automated builds and tests
- actions: basic overview
- actions: creating actions + syntax
- actions: creating actions with docker
- actions: features overview
- actions: finding and customizing github actions
- actions: intro
- actions: nodejs
- actions: security hardening
- actions: syntax (must read)
- actions: using dbs and service containers
- artifacts: persisting data
- cache: intro
- dependabot
- jobs: intro
- jobs: using a build matrix
- runners: all hosted runners
- runners: hardware resources
- runners: self hosted
- runners: self hosting
- runners: self-hosted labels
- secrets: intro
- variables: context vars
- variables: expressions
- variables: intro
- variables: must read
- variables: default vars
- workflows: event triggers
- workflows: intro
- workflows: reusing
- workflows: starter kits
- github starter deployment workflows
- about cd with github
- run terraform github action
- connect to AWS via github action
- security hardening with openid connect
- deploying with github actions
- using environments for deployment
- from aws perspective
- press release
- copying an existing project
- providing feedback
- planning and tracking with projects
- managing project templates
- automation: builtin
- automation: api
- automation: via github actions
- tasklists: intro
- view: layouts
- fields: intro
- fields: milestones
- fields: labels
- fields: iteration
- git: i'm always talking about git-scm.com
- three states of git
- working directory: uncommitted changes
- single checkout of one version of the project; pulled out of the compressed database (the .git directory) and placed on disk for you to use/modify
- staging area: changes you've staged for the next commit
- stores information about what will go into your next commit (the index)
- .git directory: committed changes (the local repository)
- where git stores the metadata and object database for your project
- what is copied when you clone a repository from another computer
- high level
- snapshots, not differences: other VCSs (e.g. subversion) store data as a set of diffs; git stores data as snapshots
- nearly every operation is local: many other VCSs require a network connection, git almost never does
- since you have the entire history of the project on your local disk, most operations seem instantaneous
- integrity: everything in git is checksummed before it is stored; and then only referred to by that checksum
- i.e. its impossible to change the contents of a file/directory without git knowing about it
# @see https://www.conventionalcommits.org
## type(scope): description \n bodyMsg \n footerMsg
Add(whatever): hello world
this is an example
Create
Refactor
Fix
Release
Document
Modify
Update
Remove
Feat
Delete etc...
## body...
Feat(api)!: breaking change
Feat(api): regular feat related to api
Feat: regular feat
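a full message with body and footer might look like this (sketch; the scope and wording are made up):
Feat(api)!: drop support for v1 endpoints

v1 endpoints are no longer served, clients must migrate to v2

BREAKING CHANGE: requests to /api/v1/* now return 410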
# delete a branch locally & remote
git branch -d BRANCH_NAME
git push origin --delete BRANCH_NAME
# reset SOMEBRANCH to whatever is upstream
git fetch origin
git reset --hard origin/SOMEBRANCH
# other options
git reset --soft HEAD^ # undo commits, but leave staged
git reset HEAD^ # undo commits & staged, but leave work tree
git reset --hard HEAD^ # undo everything
# other stuff
git config --list
git config --list --show-origin
git config --global -e # edit global config in default editor
# rebasing
git rebase -i shaOfFirstCommitToRewrite^ # use this and move on
git rebase -i shaOfLastGoodCommitButNotIncluded
git rebase -i HEAD~n
# n = number of commits to rewrite
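the todo list git opens looks roughly like this; change pick to reword/squash/fixup/drop per commit (shas and messages are placeholders)
pick a1b2c3d add login form
squash d4e5f6a fix typo in login form
reword f00dcaf update readme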
# commit diff between two branches
git log --oneline --graph --decorate --abbrev-commit master..develop
git status -s # short status
# [staging][workingtree] FILENAME
# ?? somefile # untracked
# A  somefile # new file, staged
#  M somefile # modified in working directory but not yet staged
# M  somefile # modified and staged
# MM somefile # modified, staged, then modified again
git diff # everything unstaged (not added)
git diff --staged # everything staged (added), but not yet committed
git commit -a -m 'ur msg' # but be sure you want to add all changed files
git rm --cached dont/track/this/file/and/remove/from/staging
git mv prevname newname # better than doing a linux mv
# debugging
git ls-files # information about files in the index and working tree
git cat-file # content/type+size info about repository objects
git cat-file -p HEAD:file_or_directory_path # pretty-print the object at that path
git log -n 5 #show the recent 5 commits
git log --since=2016-01-15 #show commits since january 15 2016
git log --author="noahehall" #all commits by noahehall
# managing remotes
git clone <url> <newname>
git remote -v # check where git push will send the files
git remote rm origin # disconnect your local dir from the remote repo, e.g. if you're changing the remote url
git remote add origin <url> # add a remote repo to your local dir
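# not in the list above, but handy: change the url of an existing remote (instead of rm + add)
git remote set-url origin <url>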
- types
  - system: /etc/gitconfig
    - for every user on the system and all repositories
  - user: ~/.gitconfig | ~/.config/git/config
    - for a specific user on the system
    - this is whats modified when using the --global option
  - local: [somerepo]/.git/config
    - specific to a repository
    - this is whats modified when using the --local option
# install
sudo apt install git-all
sudo apt install install-info # for debian (e.g. ubuntu), only if installed from source
git config user.name # see what your username is
git config --show-origin user.name # see where the value for user.name is coming from
git config --global user.name "poop"
git config --global user.email "USERNAME@users.noreply.github.com" # always use the noreply, thank me later
# ^ set your editor
git config --global core.editor nano # or e.g. vscode
# ^ default branch for new repositories
git config --global init.defaultBranch develop
#
# gpg signature verification for your user
# in github webconsole, your email will be SOMEID+USERNAME@users.noreply.github.com
# however in gitconfig, you cant use SOMEID, so use USERNAME@users.noreply.github.com in gpg
# as well as in git config --global user.email "USERNAME@users.noreply.github.com"
# make sure you sign commits, `git commit -S -m 'poop'`
# ^^^^^^
# check for existing keys
gpg --list-secret-keys --keyid-format=long
# generate new key
gpg --full-generate-key
# make sure its 4096 bits long
# see note about private emails above
# retrieve the long format
gpg --list-secret-keys --keyid-format=long
# get ASCII armor format https://docs.github.com/en/authentication/managing-commit-signature-verification/generating-a-new-gpg-key
gpg --armor --export GPG_KEY_ID
# add the above output to your ssh & gpg keys in github console
# go to settings -> SSH and GPG keys -> New GPG key
# associate the key with your github account
git config --global user.signingkey GPG_KEY_ID
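# optional extras (standard config keys/flags, not from the original notes): sign every commit by default, then verify
git config --global commit.gpgsign true # so you dont have to remember -S
git log --show-signature -1 # check the signature on the latest commit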
# setup your .gitignore
# Blank lines or lines starting with # are ignored.
# Standard glob patterns work, and will be applied recursively throughout the entire working tree.
# You can start patterns with a forward slash (/) to avoid recursivity.
# You can end patterns with a forward slash (/) to specify a directory.
# You can negate a pattern by starting it with an exclamation point (!).
# An asterisk (*) matches zero or more characters;
# [abc] matches any character inside the brackets (in this case a, b, or c);
# a question mark (?) matches a single character;
# brackets enclosing characters separated by a hyphen ([0-9]) matches any character between them (in this case 0 through 9).
# two asterisks to match nested directories; a/**/z would match a/z, a/b/z, a/b/c/z, and so on.
# examples
# ignore all .a files
*.a
# but do track lib.a, even though you're ignoring .a files above
!lib.a
# only ignore the TODO file in the current directory, not subdir/TODO
/TODO
# ignore all files in any directory named build
build/
# ignore doc/notes.txt, but not doc/server/arch.txt
doc/*.txt
# ignore all .pdf files in the doc/ directory and any of its subdirectories
doc/**/*.pdf
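# not from the book, but useful for debugging ignores: shows which .gitignore file, line, and pattern matched a path
git check-ignore -v some/path/file.pdf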
# enforce nano as your editor, merge & difftool
git config --global --unset core.editor # reset something to the default
git config --global diff.tool nano
git config --global core.editor nano
git config --global merge.tool nano
- HEAD: the tip of the source/base branch (e.g. feature)
- theirs: the target branch, e.g. develop
- strategies
- merge commit strategy:
- merge commit created in source branch, but not target branch
- you can then test the source branch
- revert merge commit if necessary
- push to target branch if valid
git checkout inThisBranch
git rebase addTheseChanges
<<<<<<< HEAD
head/base/source changes
=======
target/other changes
>>>>>>>
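after fixing the markers, roughly this (standard flow; if you were merging instead, use git merge --continue / --abort):
git add path/to/fixed/file
git rebase --continue # or git rebase --abort to bail out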
- organizations have project templates
- else you can copy an existing project
- tickets
- add existing issues/PR to a project by adding an item and pasting in the URL instead of a name
- views
- the view type (board, table, roadmap) will determine what options are available in the view title dropdown
- table: fields, group by, sort by
- board: fields, column by, sort by, field sum
- the columns are determined by issue status field
- roadmap: group by, markers, sort by, dates, zoom level
- change settings dynamically, and click discard to go back to the saved display configuration
- an adaptable spreadsheet, task-board, and road map that integrates with your issues and pull requests on GitHub to help you plan and track your work effectively
- customize multiple views by filtering, sorting, grouping your issues and pull requests, visualize work with configurable charts, and add custom fields to track metadata specific to your team
- built from the issues and pull requests you add, creating direct references between your project and your work
- in various places you can use expressions, e.g. filters
## can use operators
# >, >=, <, <=, and ..
# filters without a `fieldName:` apply to text fields and item titles
# number filters can use all the operators, `someNumField:10..20` between 10 and 20
# select fields use `fieldName:optX,optY`
# notice there is no = just use fieldName:fieldValue
# item is an issue or PR, and is labeled bug
is:issue,pr label:bug
# status === deployed
status:deployed
# status !== deployed
-status:deployed
# iterations after this one
iteration:>"my iteration label"
# can use @current, @previous, or @next
iteration:@current
- add metadata to your issues, pull requests, and draft issues and build a richer view of item attributes
- iteration: plan upcoming work and group items
- labeled, repeating blocks of time
- when you create an iteration field, 3 iterations are automatically created
- edit the field to add/delete/rename labels
- each labeled iteration can have different lengths
- breaks are automatically inserted between sparse labels
- milestones: prioritize and track progress on groups of tickets, provides an overview of all child tickets
- A user-provided description of the milestone, which can include information like a project overview, relevant teams, and projected due dates
- The milestone's due date
- The milestone's completion percentage
- The number of open and closed issues and pull requests associated with the milestone
- A list of the open and closed issues and pull requests associated with the milestone
- create tickets directly in the milestone
- while milestones are set on individual issues, they are defined per repository
- labels: classify tickets per repository
- organizations can manage the default labels for repos within the organization
- default labels
- good first issue: automatically included in the repos contribution page
- builtin automation: changes to ticket (issue/pr) state automates ticket status
- item added to project
- item reopened
- item closed
- code changes requested
- code review approved
- pull request merged
- auto-archive items: filter runs every 12 hours against your project to archive matching tickets
- auto-add to project: create a filter that matches issues/prs across repos to auto-add to a project
- continue:
- workflow: triggered in response to an event; a configurable automated process that will run one/more jobs
- jobs: one/more tasks that make up a workflow; each run inside a runner (a VM/container), executed sequentially/parallel
- a job will execute all its steps on a single runner
- by default jobs are isolated, but you can force dependencies, e.g. to share a build job with a deploy job
- steps: scripts/actions that make up a job
- executed in the order they appear
- are dependent on each other
- share the VM (and the data)
- action: reusable script to help simplify workflows
- event: a specific activity that triggers a workflow run
- runner: a server that runs your workflows
- each runner can run a single job at a time
- artifacts: files generated in a uses/run cmd that can be shared across jobs in the same workflow
- all run/uses cmds have write access to that workflows artifacts
- secrets: stored in Github as secrets, then referenced in your ci yml file
- see finding and customizing actions link
- action sources
- in your repo
- in any public repo
- a published docker container image on docker hub (w00p w00p)
- use needs to create a dependency between jobs; dependent jobs run sequentially (see the sketch below)
- all dependent jobs are skipped if the needs job(s) fail
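a minimal sketch of needs (job names are made up):
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - run: echo "building"
  deploy:
    needs: build # deploy waits for build; skipped if build fails
    runs-on: ubuntu-latest
    steps:
      - run: echo "deploying"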
- by default, env vars are scoped to the run/uses block that defines them
- enable you to share generated files with other jobs in the same workflow
- see yml below
- once the cache is created, it is available to all workflows in the same repository
- dont store any sensitive info in the cache of public repos
- especially cmdline programs like docker login which store creds in a config file
- anyone with read access can create a pull request and access the contents of the cache
- even from forks, by making a pull request to the base branch
- cache vs artifacts
- cache: reuse files that dont change often between jobs
- artifacts: save files produced by a job to view after a workflow has ended
- access caches
- a workflow can access and restore a cache:
- in the current branch
- the base branch (including base branches of forked repos)
- the default branch
- cache isolation exists between different branches
- a cache created for branch POOP with the base develop
- ^ is not accessible in branch FLUSH with the base develop
- a workflow can access and restore a cache:
- caching logic
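a minimal sketch of typical cache usage with actions/cache (the path/key here assume a node project):
- uses: actions/cache@v3
  with:
    path: ~/.npm
    key: ${{ runner.os }}-npm-${{ hashFiles('**/package-lock.json') }}
    restore-keys: |
      ${{ runner.os }}-npm-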
- specified with on: ...
- a single event, any of a list of events, or a time schedule
- if a list of events is provided, your workflow could execute multiple times
- use on.event_name.types to restrict a specific event to certain types, e.g. issue_comment > created
- specifying multiple types could cause multiple workflow runs
- use filters to further restrict events, e.g. the branches filter should specify which branch
- common events: if any are raised, the workflow will run
- push, fork, pull_request, pull_request_target
- label, issue_comment, issues, milestone
- page_build, project, project_card, project_column
- use project.create to setup racexp
- create, delete (branch/tag)
- deployment, deployment_status
- common types: if any are true the workflow will run
- created, edited, deleted, opened, labeled
- common filters: if all are true, the workflow will run
- branches, branches-ignore: match against refs/heads
- tags, tags-ignore: match against refs/tags
- paths
- all usually accept glob patterns: * | ** | + | ? | !
  - e.g. !dontincludethisbranchorfile** or includethis
- schedule syntax:
  schedule:
    - cron: 'your cron here'
- read the docs on this one
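a sketch combining an event filter, types, and a schedule (branch names and cron are assumptions):
on:
  push:
    branches: [develop]
  pull_request:
    types: [opened, synchronize]
  schedule:
    - cron: "0 6 * * 1" # mondays 06:00 UTC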
- workflow_call: define inputs and outputs for reusable workflows
- from a public registry (e.g. google container registry):
uses: docker://gcr.io/cloud-builders/gradle
- from docker hub:
uses: docker://alpine:3.8
- are unmasked and shouldnt be used for anything sensitive
- limited to 48kb per var and 25kb per workflow run
- can have 1000 per org, 500 per repo, 100 per env
- can be configured (repo/org) or custom (defined with env inside a workflow)
- FYI
  - you need to check whether env.blah or somecontext.blah is more appropriate
    - depends on the event, e.g. push vs pull_request
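a sketch of referencing a configured (repo/org) variable vs a custom workflow env var (names are made up):
env:
  SERVICE_URL: ${{ vars.SERVICE_URL }} # configured at repo/org level
jobs:
  ping:
    runs-on: ubuntu-latest
    env:
      GREETING: hello # custom var defined in the workflow
    steps:
      - run: echo "$GREETING -> $SERVICE_URL"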
- are masked
- syntax: ${{ any expression here }}
- literals: null, true/false, number, float, string
- operators:
- grouping ()
- array and object accessors: [] | .
- comparisons: ! < > <= >= == != && ||
- functions:
- note: null, boolean, number, string, array, and object values are coerced when compared (they are not functions)
- contains(doesThis, containThis)
- startsWith(doesThisStart, withThis)
- endsWith()
- format('this {0} {1}', 'with', 'this')
- join(thisArray, ', ')
- toJSON(prettyPrint)
- hashFiles(thisPath)
- conditionals: automatically parsed as expressions, ${{}} isnt needed
  - if:
    - cannot directly reference secrets
      - instead set secrets as job-level env vars and check the env vars in the if (see the sketch below)
    - available status checks
      - success(): true if no previous steps failed/were canceled
      - always(): ignores status of previous steps
      - cancelled(): true if the workflow was canceled
      - failure(): true if any previous step failed
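a sketch of the secrets-via-env pattern for conditionals (the secret name is made up):
jobs:
  notify:
    runs-on: ubuntu-latest
    env:
      SLACK_TOKEN: ${{ secrets.SLACK_TOKEN }}
    steps:
      - if: ${{ env.SLACK_TOKEN != '' }} # secrets cant be referenced in if directly
        run: echo "token is configured"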
- info about workflow runs, vars, runner environments, jobs and steps
- are referenced using the expression syntax
- env: reference custom vars defined in the workflow
- github: workflow run and the event that triggered the run
- vars: reference a configured (repo/org) variable
# for the full syntax @see https://docs.github.com/en/actions/using-workflows/workflow-syntax-for-github-actions
name: some-workflow-name
run-name: some name for this specific run
defaults: # can also be scoped to a specific job
run:
shell: bash
working-directory: "."
env: # can also be scoped to a specific job/step
myvar: "some val"
concurrency: # ensures only a single job/workflow executes at a time
group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.ref }}
cancel-in-progress: true
on: # @see https://docs.github.com/en/actions/using-workflows/events-that-trigger-workflows
some_event:
types: [this_thing, or_this_thing]
some_other_event:
when_these_are_true:
- this_thing
- or_this_thing
jobs:
some-job-name:
# @see dbs and service containers
# container: node:10.18-jessie
# services:
# postgres:
# image: postgres
runs-on: ${{ matrix.os }} # use the matrix below, or hardcode, e.g. ubuntu-latest
strategy:
fail-fast: true
matrix:
os: [ubuntu-latest]
node: [18, 19]
steps: # each array item runs in the order defined
- name: name this step
run: echo "i belong to name^"
if: ${{ github.event_name == 'pull_request' && github.event.action == 'unassigned' }}
continue-on-error: true
timeout-minutes: 1
- uses: actions/checkout@v3 # always use this to checkout the repos code
if: ${{ failure() }} # if the previous step failed
- uses: actions/setup-node@v3 # theres bunches of these for specific tech stacks
with: # generally a `uses` needs a `with`
node-version: "14" # dizzam its on 19 now
if: ${{ always() }} # will always run, even on failures
- uses: actions/upload-artifact@v3 # upload an artifact; only jobs in the same run can overwrite it
  with:
    path: wherever/poop.log
    name: my-artifact
- uses: actions/download-artifact@v3 # download an artifact uploaded earlier in the same workflow run
  with:
    name: my-artifact
- run: npm install -g bats # a cmd, not reusable
env: # are set in the env of run
WOOP: true
- run: "./.github/scripts/poop.sh" # prefer this, so we can reuse them
shell: bash
- name: retrieve a secret
env:
super_secret: ${{ secrets.SUPERSECRET }}
run: | # inline, multiline script
normalbashfn "$super_secret"
- lol what happened here? must be in another file