Unable to login using local AWS profile with role_arn / source_profile and MFA #5767
I haven't had time to learn the Vault codebase yet, but I was able to whip up a workaround for this that others might find usable. It's based on the signing code present in the Vault CLI: https://gist.github.com/gwilym/1db446f67a4d62db50d1139082e5b719. The output of this app should be usable as part of a Vault AWS login.
Do you have any update?
I've been reading this post a few times whilst trying to solve a different issue, but then I read something that finally clicked: you have used `source_profile` in `~/.aws/credentials`, but according to the docs it can only be used in the CLI config file `~/.aws/config`.
Example configuration using `source_profile`: see https://docs.aws.amazon.com/cli/latest/topic/config-vars.html
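For illustration, a minimal `~/.aws/config` using `source_profile` might look like the following. This is a hypothetical sketch, not the commenter's actual file; the profile names, account ID, and MFA serial are all placeholders:

```ini
; ~/.aws/config — hypothetical example; all values are placeholders
[default]
region = us-east-1

[profile admin]
role_arn = arn:aws:iam::123456789012:role/admin
source_profile = default
mfa_serial = arn:aws:iam::123456789012:mfa/my-user
```

The long-lived access keys for the `default` profile would still live in `~/.aws/credentials`; only the role-assumption settings go in `~/.aws/config`.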
As @kevinpgrant correctly pointed out, `source_profile` belongs in `~/.aws/config`, not in `~/.aws/credentials`.
We haven't heard back regarding this issue in over 29 days. To try and keep our GitHub issues current, we'll be closing this issue in approximately seven days if we do not hear back. Please let us know if you can still reproduce this issue and whether there is any more information you could share; otherwise we'll be closing this issue.
Yes, this is still an issue. The last comment was from yourself confirming that, wasn't it?
Sorry, I didn't mean to put the comment there. Too many open tabs; please ignore it.
I know this ticket is about MFA, but am I correct in thinking that it also currently isn't possible to use AWS CLI profiles which assume roles? Currently I'm working around this with the `aws sts assume-role` command and exporting various environment variables from its output.
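That workaround can be sketched roughly like this. This is a hypothetical reconstruction, not the commenter's exact script; the role ARN and session name are placeholders:

```shell
# Hypothetical sketch of the assume-role workaround described above.
# The role ARN used below is a placeholder.
assume_role_env() {
  local role_arn="$1"
  local creds
  # Ask STS for temporary credentials; the --query returns the three
  # values tab-separated on a single line.
  creds=$(aws sts assume-role \
    --role-arn "$role_arn" \
    --role-session-name vault-login \
    --query 'Credentials.[AccessKeyId,SecretAccessKey,SessionToken]' \
    --output text)
  # Export them so the Vault CLI's AWS auth method picks them up.
  export AWS_ACCESS_KEY_ID=$(printf '%s\n' "$creds" | cut -f1)
  export AWS_SECRET_ACCESS_KEY=$(printf '%s\n' "$creds" | cut -f2)
  export AWS_SESSION_TOKEN=$(printf '%s\n' "$creds" | cut -f3)
}

# Then, for example:
#   assume_role_env arn:aws:iam::123456789012:role/my-role
#   vault login -method=aws role=my-role
```

Since the exported variables sit at the top of the SDK's credential chain, Vault uses them directly and never touches the profile logic that is broken here.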
@cablespaghetti I came to exactly the same conclusion and the same workaround.
@gw0 I actually stopped doing this, as it was a horrible user experience. I now have a dockerised bash script running as a Kubernetes cron job which syncs the members of an IAM group with Vault, so they can log in as their normal user.
Still an issue for me. Would be nice to have this fixed.
I too am receiving this when logging into a central AWS account role that is used to assume roles into other AWS accounts.
I actually managed to make it work. First of all, I need to mention that I use this approach to simulate ECS agent behavior for my local containers; it essentially creates a credentials provider server the containers talk to. So inside the container I run this script:

```shell
if [[ ${IS_LOCAL_ENV} =~ (true) ]]; then
  export VAULT_ROLE="<my role ARN>";
  rm -rf ~/.aws
  mkdir ~/.aws
  echo "Running in local env. Assuming IAM Role ${VAULT_ROLE}";
  # Write the STS output to ~/creds.json, where the reads below expect it.
  aws sts assume-role \
    --role-arn ${VAULT_ROLE} \
    --role-session-name docker-compose-local > ~/creds.json;
  echo "Setting up AWS credentials..."
  echo "[default]" > ~/.aws/credentials
  echo "aws_access_key_id = $(cat ~/creds.json | jq -r '.Credentials.AccessKeyId')" >> ~/.aws/credentials
  echo "aws_secret_access_key = $(cat ~/creds.json | jq -r '.Credentials.SecretAccessKey')" >> ~/.aws/credentials
  echo "aws_session_token = $(cat ~/creds.json | jq -r '.Credentials.SessionToken')" >> ~/.aws/credentials
  echo "[profile assumed]" > ~/.aws/config
  echo "role_arn = $(cat ~/creds.json | jq -r '.AssumedRoleUser.Arn')" >> ~/.aws/config
  echo "source_profile = default" >> ~/.aws/config
  export AWS_PROFILE=assumed
else
  echo "Running in AWS. Falling back to the associated IAM role.";
fi
echo "Using AWS Profile: ${AWS_PROFILE}"
echo "Running Vault login..."
vault login -method=aws -path=somepath -namespace=somens header_value=someaddress role=read-only
```

Now the problem: if afterwards, in the same container, I run exactly the same `vault login` command again, it fails, and I just cannot understand what causes the error.

UPDATE: found the root of the issue.
The relevant issue on the aws-sdk-go side: aws/aws-sdk-go#3660
This is still an issue when attempting to use named AWS profiles with the Vault CLI (v1.9.3). I am not entirely certain where the issue resides, whether it be AWS or Vault; it fails regardless of MFA configuration. As suggested above, setting the AWS environment variables is the workaround. The following approach can be leveraged as inspiration: it grabs the credentials from the assume-role process, sets the appropriate environment variables, logs into Vault, reads database credentials, unsets the AWS environment variables, and lastly logs into a psql database.

Hope this helps!
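A rough, hypothetical sketch of that flow follows. The role ARN, Vault role, database secret path, and database host are all placeholders, not the commenter's actual values:

```shell
# Hypothetical end-to-end flow; every ARN, path, and hostname below
# is a placeholder.
vault_db_psql() {
  # 1. Assume the role and export its temporary credentials.
  eval "$(aws sts assume-role \
    --role-arn arn:aws:iam::123456789012:role/vault-login \
    --role-session-name vault \
    --query 'Credentials.[AccessKeyId,SecretAccessKey,SessionToken]' \
    --output text \
    | awk '{printf "export AWS_ACCESS_KEY_ID=%s AWS_SECRET_ACCESS_KEY=%s AWS_SESSION_TOKEN=%s\n", $1, $2, $3}')"

  # 2. Log in to Vault with the AWS IAM auth method.
  vault login -method=aws header_value="${VAULT_ADDR}" role=my-role

  # 3. Read short-lived database credentials from Vault.
  local db_user db_pass
  db_user=$(vault read -field=username database/creds/my-db-role)
  db_pass=$(vault read -field=password database/creds/my-db-role)

  # 4. Drop the AWS credentials again.
  unset AWS_ACCESS_KEY_ID AWS_SECRET_ACCESS_KEY AWS_SESSION_TOKEN

  # 5. Connect to Postgres with the Vault-issued credentials.
  PGPASSWORD="$db_pass" psql -h db.example.com -U "$db_user" mydb
}
```

Unsetting the AWS variables before the `psql` step keeps the temporary STS credentials from leaking into the rest of the session once they are no longer needed.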
This is still an issue. I would like to use my profiles in the AWS config file, and they use `role_arn` with `source_profile`. It would be especially great if Vault Agent could work with this, but I would probably file a new issue for that if it were fixed in the CLI.
Any news on this issue? I have the same problem, but in my case it is with Terraform. NB: I was able to implement a workaround (inspired by previous examples), but it looks pretty bad IMHO: https://gist.github.com/Westixy/bc70ee782fe759094bf5c1c65c248f6c
This is affecting me as well. I'm surprised to see this issue is 4 years old and we still can't set a profile that uses `source_profile`.
Same issue here.
Here's my version of @Westixy's script. It doesn't write credentials onto the filesystem. It also assumes that the AWS backend is configured to require the auth header set to the URL of Vault; this is the 3rd parameter, which is also the URL. You'll want to remove lines #12 and #29 if this is not applicable.
I use this workaround to enable the Vault Terraform provider to have a consistent config in environments where an EC2 instance or IRSA role can be used. This method assumes the selected role and stores the AWS credentials in environment variables. To use it, add the following function to your shell profile:

```shell
# Usage: vault-aws-auth arn:aws:iam::123456789012:role/MyRole
vault-aws-auth() {
  AWS_ROLE_ARN="$1"
  unset AWS_ACCESS_KEY_ID
  unset AWS_SECRET_ACCESS_KEY
  unset AWS_SESSION_TOKEN
  export $(printf "AWS_ACCESS_KEY_ID=%s AWS_SECRET_ACCESS_KEY=%s AWS_SESSION_TOKEN=%s" \
    $(aws sts assume-role \
      --role-arn $AWS_ROLE_ARN \
      --role-session-name vault \
      --query "Credentials.[AccessKeyId,SecretAccessKey,SessionToken]" \
      --output text))
}
```

After running it, log in with:

```shell
vault login -method=aws header_value=${VAULT_ADDR}
```

For the Vault Terraform provider, the equivalent configuration is:

```hcl
provider "vault" {
  address = var.vault_addr
  auth_login {
    path   = "auth/aws/login"
    method = "aws"
    parameters = {
      role         = var.vault_role
      header_value = var.vault_addr
    }
  }
}
```
I can confirm that the local AWS profile is not being used at all. I'm doing an AWS IAM Roles Anywhere setup, which relies on `credential_process`. The following basic Go SDK program works as expected with the same profile:

```go
package main

import (
	"fmt"
	"os"

	"github.com/aws/aws-sdk-go/aws/session"
	"github.com/aws/aws-sdk-go/service/s3"
)

func main() {
	if len(os.Args) < 2 {
		fmt.Println("you must specify a bucket")
		return
	}
	sess := session.Must(session.NewSession())
	svc := s3.New(sess)
	i := 0
	err := svc.ListObjectsPages(&s3.ListObjectsInput{
		Bucket: &os.Args[1],
	}, func(p *s3.ListObjectsOutput, last bool) (shouldContinue bool) {
		fmt.Println("Page,", i)
		i++
		for _, obj := range p.Contents {
			fmt.Println("Object:", *obj.Key)
		}
		return true
	})
	if err != nil {
		fmt.Println("failed to list objects", err)
		return
	}
}
```

I've tested with Python too, and in both cases, using the default provider chain, it worked.
Describe the bug

Attempting to `vault login` using this particular IAM profile setup in `~/.aws/credentials` fails with an error. The same setup works OK with the official `aws` CLI (`AWS_PROFILE=admin aws sts get-caller-identity` works), as well as with basic usage of the Go SDK.

Example of `~/.aws/credentials`:
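A hypothetical illustration of the shape of such a setup (the keys are placeholders, and `ACCOUNTID`/`SUBACCOUNTID` stand in for the real account IDs):

```ini
[default]
aws_access_key_id = AKIAXXXXXXXXXXXXXXXX
aws_secret_access_key = xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx

[admin]
role_arn = arn:aws:iam::SUBACCOUNTID:role/admin
source_profile = default
mfa_serial = arn:aws:iam::ACCOUNTID:mfa/my-user
```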
Note: the account IDs above may be the same account, though for this case it likely doesn't matter because Vault fails during the credential-load stage. There are likely two potential issues here: one with credential loading, and one with enabling an MFA token provider for the AWS SDK.
To Reproduce

Steps to reproduce the behavior:

1. `vault server -dev` with credentials to utilise `aws` auth
2. `vault auth enable aws`
3. `vault write auth/aws/config/client iam_server_id_header_value=vault.example.com`
4. `vault write auth/aws/role/admin auth_type=iam 'bound_iam_principal_arn=arn:aws:sts::SUBACCOUNTID:assumed-role/admin/*' max_ttl=8h`
5. `AWS_SDK_LOAD_CONFIG=1 AWS_PROFILE=admin vault login -method=aws header_value=vault.example.com role=admin`
Expected behavior
Environment:

- Vault server version (`vault status`): 1.0.0-beta2
- Vault CLI version (`vault version`): Vault v1.0.0-beta2 ('8f61c4953620801477ad40f9d75063659acb5d84')
- Vault server configuration file(s): None, I've been using `-dev`.

Additional context
Apologies up front if I'm missing anything fundamental: I am brand new to Vault. If anything looks off here let me know and I will try to clarify.
When I modify Vault's `awsutil` package to enable verbose errors, I get the following extra info:
The EC2 errors are expected since I'm running this locally; however, the `admin` profile shouldn't need access keys within it, due to `source_profile`. When I take the access keys and put them in the profile, that prevents the role-switch from happening and it attempts to log in using the original credentials instead (which is not expected).

Not sure if this helps, but here's an example of a simple, working-as-expected Go AWS SDK usage: