We have some roles which cannot be made public, but are required in multiple projects. These repositories are accessed via Deploy Keys.
I would like to be able to specify an SSH key in the roles file for ansible-galaxy to use per role (if required), similar to how I can in the git module with the key_file argument.
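For context, this is roughly how the git module's `key_file` argument is used today (the repository URL, destination, and key path below are illustrative):

```yaml
# Cloning a private repo with a deploy key via the git module.
# repo, dest, and key_file values are examples only.
- name: Clone a private role repository with a deploy key
  git:
    repo: git@github.com:example-org/private-role.git
    dest: /etc/ansible/roles/private-role
    key_file: /home/deploy/.ssh/private_role_deploy_key
```

Having an equivalent per-role option in `requirements.yml` would let ansible-galaxy handle these private repositories directly.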
My first thought was to use the environment variable GIT_SSH, but you wouldn't be able to set that at a per-role level when installing via requirements.yml.
A workaround would be to run the install from a playbook. The playbook task would use the command module to run ansible-galaxy, combined with with_items to loop over a data structure containing each role name and its SSH key path.
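A minimal sketch of that workaround, assuming git 2.3+ so `GIT_SSH_COMMAND` can carry the per-item key (role URLs and key paths are illustrative):

```yaml
# Loop over private roles, pointing git at a different deploy key per item.
# All repo URLs and key file paths here are example values.
- name: Install private roles with per-role deploy keys
  hosts: localhost
  connection: local
  tasks:
    - name: Install each role with its own SSH key
      command: ansible-galaxy install {{ item.src }} -p ./roles
      environment:
        GIT_SSH_COMMAND: "ssh -i {{ item.key_file }} -o IdentitiesOnly=yes"
      with_items:
        - { src: "git+ssh://git@github.com/example-org/role-one.git",
            key_file: "~/.ssh/role_one_deploy_key" }
        - { src: "git+ssh://git@github.com/example-org/role-two.git",
            key_file: "~/.ssh/role_two_deploy_key" }
```

Setting the environment per task item is what makes this work at a per-role level, which a single global `GIT_SSH` export cannot do.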
@chouseknecht Thanks for the quick response, and for the suggestion of using a playbook.
It's a workaround we are currently using in one place, but it's unsatisfactory: it adds complexity that we shouldn't need. We also use public roles from Galaxy, so installing roles from two places is not the end of the world, but it is undesirable.