s3 remote state still broken for multiple users #13032
Comments
I second this. We are stuck on 0.8.8 until this is fixed, since we use environment variables in the state filename when declaring remote state via the command-line `remote` option, and we use S3 as our remote backend.
We create our Terraform state file's name from environment variables per application/environment. This suggestion worked for me.
The solution above, #13022 (comment), works for us as well.
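For reference, the workaround in #13022 (comment) relies on Terraform 0.9's partial backend configuration: leave the per-user values out of the backend block and supply them from the shell at `terraform init` time. A minimal sketch of that style of workaround; the bucket name and the `APP_NAME`/`ENVIRONMENT` variables are assumptions for illustration:

```hcl
# Partial backend configuration: the per-user / per-environment
# values are deliberately omitted from the committed config.
terraform {
  backend "s3" {
    bucket = "my-terraform-state"
    region = "us-east-1"
  }
}
```

```sh
# Supply the remaining values from the shell's environment at init time.
terraform init \
  -backend-config="key=${APP_NAME}/${ENVIRONMENT}/terraform.tfstate" \
  -backend-config="profile=${AWS_PROFILE}"
```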
I am going to close this issue due to inactivity. If there is still a question, I recommend the community forum, where there are far more people available to help. If there is a bug or you would like to make a feature request, please open a new issue and fill out the template.
I'm going to lock this issue because it has been closed for 30 days ⏳. This helps our maintainers find and focus on the active issues. If you have found a problem that seems similar to this, please open a new issue and complete the issue template so we can capture all the details necessary to investigate further.
Terraform Version
0.9.1
Affected Resource(s)
s3 remote state backend
Terraform Configuration Files
Debug Output
No debug output necessary
Panic Output
No panic
Expected Behavior
Actual Behavior
Steps to Reproduce
This is a design problem.
Important Factoids
Not everyone can store ALL of their AWS credentials under a single [default] profile in a single credentials file, nor would everyone want to! Each shell has an environment context specific to that shell, and it's malleable for a reason. Multiple accounts/customers/profiles/etc. should not bear the risk of collision just because nobody can come up with a better idea than ~/.aws/credentials, with no way to redefine that default file, or the profile to look up within it.
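For context, the per-shell credential context described above is conventionally set with the standard AWS SDK environment variables; the paths and profile names below are illustrative only:

```sh
# Shell A: one customer/account.
export AWS_SHARED_CREDENTIALS_FILE="$HOME/.aws/customer-a-credentials"
export AWS_PROFILE="customer-a"

# Shell B: a different customer/account, with no collision with shell A.
export AWS_SHARED_CREDENTIALS_FILE="$HOME/.aws/customer-b-credentials"
export AWS_PROFILE="customer-b"
```

Whether Terraform's vendored AWS SDK honors these variables in every code path is, in effect, what this issue disputes.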
Storing static credential-file references within the remote state configuration for S3 is simply wrong. Those credentials are already implicit in the act of fetching the remote state in the first place: they are what opens the S3 connection that downloads it. Specifying them again inside the persisted backend metadata is at best redundant.
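Concretely, the objection is to a configuration like the following, where user-specific values get baked into the backend block and persisted with the state. The names are illustrative; `profile` is a real argument of the s3 backend:

```hcl
# Static, user-specific references hard-coded into the backend
# configuration. Every collaborator would need an identically named
# profile in the default ~/.aws/credentials file for this to work.
terraform {
  backend "s3" {
    bucket  = "my-terraform-state"
    key     = "app/prod/terraform.tfstate"
    region  = "us-east-1"
    profile = "customer-a"
  }
}
```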
References
I'm sure there are tons; the various "AWS credentials in S3" tickets are all about this problem, each a different facet of it.
My suggestion:
Allow another way to access environment variables explicitly, so you don't have to go through the early graph dance, and so those environment-variable references can be stored within the remote state in S3. Don't give them defaults; don't give them anything. Let it fail if they aren't defined properly, please.
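A sketch of what that could look like if backend blocks were able to reference environment variables directly. This syntax is hypothetical and not supported by Terraform; it only illustrates the suggestion:

```hcl
# HYPOTHETICAL syntax, illustrating the suggestion only.
terraform {
  backend "s3" {
    bucket  = "my-terraform-state"
    key     = "${env.TF_STATE_KEY}"   # no default: init fails if TF_STATE_KEY is unset
    profile = "${env.AWS_PROFILE}"    # no default: init fails if AWS_PROFILE is unset
  }
}
```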
Please don't tear into me too hard for pointing out what someone might not want to hear. This seems obviously broken to me; it's impacting my workflow and my use of this tool, and I'm surprised it's still an issue.