s3 remote state still broken for multiple users #13032

Closed

in4mer opened this issue Mar 24, 2017 · 5 comments

Comments

in4mer commented Mar 24, 2017

Terraform Version

0.9.1

Affected Resource(s)

s3 remote state backend

Terraform Configuration Files

No configs necessary

Debug Output

No debug output necessary

Panic Output

No panic

Expected Behavior

  • I expect the s3 remote state backend to be able to handle an arbitrarily named file that contains the AWS profile information, referenced by a reserved environment variable.
  • I expect the s3 remote state backend to be able to handle an arbitrarily named profile referenced by a reserved environment variable (see the sketch after this list).
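For illustration, a minimal sketch of the expected workflow, assuming the standard AWS SDK environment variables AWS_SHARED_CREDENTIALS_FILE and AWS_PROFILE are what the backend should honor; the bucket, key, and profile names are hypothetical:

```hcl
# Sketch only: the backend block carries no credential paths at all.
# Bucket and key names here are hypothetical, not from the original report.
terraform {
  backend "s3" {
    bucket = "example-terraform-state"
    key    = "app/terraform.tfstate"
    region = "us-east-1"
  }
}

# Each user would then select their own credentials file and profile per
# shell (assumption: the backend honors the standard AWS SDK variables):
#   export AWS_SHARED_CREDENTIALS_FILE=$HOME/.aws/customer-a-credentials
#   export AWS_PROFILE=customer-a
```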

Actual Behavior

  • any attempt to use a non-standard filename through the shared_credentials_file variable in the provider does exactly the opposite: when the remote state is exported, the shared_credentials_file path is hardcoded into the remote state itself, thereby requiring every user of that remote state to keep their credentials in that globally defined, absolute-pathed file.
  • interpolation is verboten within the backend config block, even if the values are provided via TF_VAR_* environment variables. This seems undesirable; being overridable per shell is the entire point of environment variables. (See the sketch after this list.)
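A minimal sketch of both failure modes, with hypothetical bucket, key, and path names; the exact error text varies by version:

```hcl
terraform {
  backend "s3" {
    bucket = "example-terraform-state"   # hypothetical
    key    = "app/terraform.tfstate"     # hypothetical
    region = "us-east-1"

    # Failure mode 1: this absolute path gets recorded alongside the state,
    # so every collaborator must keep credentials at the same path.
    shared_credentials_file = "/home/alice/.aws/customer-a-credentials"

    # Failure mode 2: interpolation is rejected in backend config, even with
    # TF_VAR_credentials_file exported in the shell -- the backend is
    # initialized before variables are evaluated.
    # shared_credentials_file = "${var.credentials_file}"
  }
}
```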

Steps to Reproduce

This is a design problem.

Important Factoids

Not everyone can store ALL their AWS credentials within a single [default] file, nor would everyone want to! Each shell has its own environmental context, and it's malleable for a reason. Multiple accounts/customers/profiles/etc. should not bear the risk of collision just because nobody can come up with a better idea than ~/.aws/credentials, with no other way to redefine that default file, or the profile to look up within it.

Storing static filename references within the remote state in S3 is actually wrong. The credentials are already implicitly supplied when locating the remote state in the first place; they are part of the S3 connection used to download it. Specifying them again within the state file's metadata is at best redundant.

References

I'm sure there are tons; the various "AWS credentials in S3" tickets are all about this problem, just different facets of it.

My suggestion:

Allow another way to access environment variables explicitly, so you don't have to go through the early graph dance, and so those environment variable references can be stored within the remote state in S3. Don't give them defaults, don't give them anything. Let it fail if they aren't defined properly, please.

Please don't tear into me too hard for pointing out what someone might not want to hear. This seems terribly obviously broken to me, and it's impacting my workflow and use of this tool, and I'm surprised it's still an issue.

@bruceharrison1984

I second this. We are stuck on 0.8.8 until this is fixed, since we currently use environment variables in the filename when declaring remote state via the command-line 'remote' option. We are also using S3 as our remote backend.

@joestump

We create our TF state file's name from environment variables per application/environment. This suggestion worked for me.

knope commented May 5, 2017

The above solution, #13022 (comment), works for us as well.
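The linked comment isn't quoted above, so the following is only an assumption about the general shape of such a workaround: partial backend configuration, where the .tf file declares an empty backend "s3" {} block and each user supplies their own values at init time. All values below are hypothetical.

```sh
# Each user passes their own bucket/key/profile when initializing,
# so nothing machine-specific ends up in version control or the state.
terraform init \
  -backend-config="bucket=example-terraform-state" \
  -backend-config="key=app/terraform.tfstate" \
  -backend-config="region=us-east-1" \
  -backend-config="profile=customer-a"
```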

bflad self-assigned this Jun 2, 2020
bflad added this to the v0.13.0 milestone Jun 2, 2020
bflad removed their assignment Jun 2, 2020
bflad removed this from the v0.13.0 milestone Jun 2, 2020
@mildwonkey (Contributor)

I am going to close this issue due to inactivity.

If there is still a question, I recommend the community forum, where there are far more people available to help. If there is a bug or you would like to make a feature request, please open a new issue and fill out the template.
Thanks!

ghost commented Oct 13, 2020

I'm going to lock this issue because it has been closed for 30 days ⏳. This helps our maintainers find and focus on the active issues.

If you have found a problem that seems similar to this, please open a new issue and complete the issue template so we can capture all the details necessary to investigate further.

ghost locked this issue as resolved and limited conversation to collaborators Oct 13, 2020