Check downloaded files with GPG before using them. #11
I saw you made a PR for this before; could you please add one here as well?
Btw, this is not done with packages.
I think I did not submit a PR for this yet. I will be happy to review your PR 😉
@enoch85 Thanks very much for your work; that means a lot to me. However, I am afraid that you completely missed the point of checking the GPG signature. I refer you to this comment: jchaney/owncloud#12 (comment)

As the public key of NextCloud does not currently seem to be available on public key servers, you can use this little trick:

```shell
gpg --homedir /tmp/gpg --import nextcloud.asc
gpg --homedir /tmp/gpg --export 28806A878AE423A28372792ED75899B9A724937A | gpg --import -
```

to enforce the correct key fingerprint of the public key.
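To make the fingerprint pinning explicit, here is a minimal, hypothetical sketch: the expected fingerprint is hard-coded and compared against whatever gpg reports for the imported key. The helper name and file layout are my assumptions, not something from this thread.

```shell
#!/usr/bin/env bash
# Hypothetical sketch: pin the expected key fingerprint and refuse to
# proceed when the imported key does not match it.
EXPECTED_FPR="28806A878AE423A28372792ED75899B9A724937A"

check_fingerprint() {
    # $1: fingerprint reported for the imported key
    if [[ "$1" == "$EXPECTED_FPR" ]]
    then
        echo "Fingerprint OK"
    else
        echo "Fingerprint mismatch! Aborting..." >&2
        return 1
    fi
}

# In a real script the actual fingerprint would come from gpg itself, e.g.:
# ACTUAL_FPR=$(gpg --homedir /tmp/gpg --with-colons --fingerprint \
#     | awk -F: '/^fpr:/ {print $10; exit}')
# check_fingerprint "$ACTUAL_FPR" || exit 1
```

This way a swapped key fails loudly instead of being silently imported and trusted.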
Your fix is valid, but as I said, it only gets you very little compared to enforcing the fingerprint in your build scripts, which would give you (and all users) a whole lot more. In my opinion, both the NextCloud documentation and the ownCloud documentation about checking GPG signatures are a bit unmindful and need to be fixed. But note that setting up NextCloud or ownCloud one time as a typical admin vs. writing scripts to build a template VM involves a different threat model: https://www.qubes-os.org/news/2016/05/30/build-security/ Refer to On Digital Signatures and Key Verification to learn why your current approach does not fully resolve this issue.
Sysadmins and devs must know that. Let me paste in my linked example:
In the case of NextCloud, the public key and the archives appear to be hosted on different servers, but that is nothing that will stop a skilled adversary. The gain in security from enforcing the fingerprint in the build script is that even when the server hosting the key is compromised, a substituted key will be rejected.

BTW, thanks for referring to your implementation again:

```shell
# $? is the exit status of the preceding gpg --verify call; note that
# echoing it first would overwrite it, so test it directly:
if [[ $? -gt 0 ]]
then
    echo "Package NOT OK! Installation is aborted..."
    exit 1
else
    echo "Package OK!"
fi
```
Disclaimer: I don’t want to be offensive in any way. I just care about security. Please reconsider.
Thank you for your long post. How would the GPG key be useful after the build is done? I don't see any use case for it. Even you deleted the whole folder in your code. Regarding the code, that's already fixed, but not merged; I just used it for testing purposes. ;) Btw, this is open source: instead of writing about it, you can actually make a PR. It's an easy fix. :D In the same amount of time you spent writing that post, you could simply have written the piece of code instead. Now it seems like you just want to show off.
I did/do that often. The problem is that when the maintainers don't understand what its purpose is, it might get broken. I have seen that happen, so excuse me for that.
I guess NextCloud updates happen via NextCloud's update mechanism? If not, a new archive might need to be downloaded again. Anyway, I think it makes sense to ship the GPG key with the VM.
I guess that's just a flag that needs to be set?
They don't use package managers like apt-get, so every new version has a new key.
I have not found one yet. If you do, let me know 😉
You sure? The whole purpose of a public key is that people can use it to verify signed releases.
@enoch85 Can you maybe reopen?
If someone else wants to work on this, sure.
Closes: nextcloud#11
Follows good example from: https://github.com/nextcloud/docker/blob/master/Dockerfile
Related to: nextcloud/server#755
I just had a look at nextcloud_update.sh and it does not check OpenPGP signatures the way the installer does. I then grepped over the repo to find other uses of wget which potentially also don't do this yet:
Can you reopen the issue? To solve this, I would propose to drop the individual wget calls and instead do a full git clone of this repo, check the signature, and then use it.
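A hedged sketch of that proposal, assuming the repo URL and that releases are verifiable signed tags (neither is confirmed in this thread); the accept/reject policy is isolated in a small function so it is explicit:

```shell
# Illustrative only: decide whether a cloned repo may be used, based on
# the result of a signature verification step.
use_repo_if_verified() {
    # $1: path to a clone, $2: verification exit status (0 = signature OK)
    if [[ "$2" -eq 0 ]]
    then
        echo "Using scripts from $1"
        return 0
    fi
    echo "Signature check failed, refusing to use $1" >&2
    return 1
}

# Real usage (network + keyring required, hence only sketched here):
# git clone https://github.com/nextcloud/vm.git /tmp/vm
# git -C /tmp/vm verify-tag "$(git -C /tmp/vm describe --tags --abbrev=0)"
# use_repo_if_verified /tmp/vm $?
```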
@ypid Wow, you mean rewrite everything, again? ;) We are just about to finish the refactoring of the current code base. Feel free to open a PR and fix this when that branch is merged to master.
Not everything. Basically just the download part. 😉
This is too much work for me as I don't actually use the scripts, and I already fixed one instance in #19. It would be great if people who use the scripts could fix this.
I'll look into this for the updater, but the rest of the code is long term.
That sounds like a really good plan. Why not do it this way in all scripts (use the local script if available)?

The best way, in my opinion, would be to give the user the choice: you could do a signed release, and the user could simply download this signed release with all needed scripts inside. If the user doesn't want the highest security but rather stability (bugfixes) and the newest features, he could choose during startup to remove all local files and use the repo files instead. The other way around: if you have chosen security, only the local files get used. But you could just remove a file afterwards and would then automatically get the latest script from the repo (or substitute a local script one time manually).

So a bit shorter:

How does that sound?
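The "use the local script if available" idea could look roughly like this; the directory layout and the download URL are purely illustrative assumptions:

```shell
# Illustrative sketch: prefer a script shipped with a (signed) release,
# fall back to fetching the latest version from the repo otherwise.
resolve_script() {
    local name="$1" local_dir="$2"
    if [[ -f "$local_dir/$name" ]]
    then
        # "Security" mode: the locally shipped copy wins.
        echo "$local_dir/$name"
    else
        # "Latest" mode: fetch from the repo (sketched, URL assumed):
        # curl -fsSL "https://raw.githubusercontent.com/nextcloud/vm/master/$name" -o "/tmp/$name"
        echo "/tmp/$name"
    fi
}
```

Deleting a single local file would then opt that one script back into "latest" mode, which matches the choice described above.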
More work, less time, more confusion. IMO, either we do a signed release or we don't. As you say, making a signed release would mean pre-downloading all the scripts for each release, and it would be version specific. That is also more work, hence this ticket is old. :)
Actually, I would like to help you code that option (using all scripts locally), but not if it is a signed release only, without the ability to always get the latest scripts from the repo.
@szaimen Please make a POC; I don't make any promises though. It would be really great to be able to finally close this ticket once and for all!
Then a short question from my side: is it enough to download all scripts during the install_production script (and somehow sign the finished files), or do we have to actually sign the released code, extract it manually once to the right directory, and then use the local files? (Also already for the install_production script?)
@szaimen Please read the issue and all the links and get your own picture. Sorry, but I don't have time to think right now; I am very stressed out. When done, please do what you think is best: make the PR in question and we can discuss it there. Thanks!
Okay, I will think it through and make a POC. But as always: it will take its time.
@szaimen We should add a script that fetches the latest `server_configuration` script. That way we don't need to keep it in the scripts folder, and it would always be the latest version of the script.
If somebody wants to test the latest implementation of this feature, please check out #1263 (comment)
I actually don't think this will ever happen. We use GPG inside the scripts where we can, and the user can verify against a SHA256 checksum before downloading the VM. Taking it a step further to actually sign each GitHub release is not on my roadmap, sorry. This has been stale for several years now; I've kept it open in case someone wanted to implement it. @szaimen made a nice attempt, but it never led to anything. @ypid If you really want this, please contribute by making a PR. I'll keep this open for a few more days, then I'm closing it if nothing happens. Thanks for your efforts!
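The SHA256 check mentioned here can be done with standard tools; a minimal sketch, in which the VM image and checksum file names are assumptions:

```shell
# Compare a file's SHA256 digest against a published value.
verify_checksum() {
    # $1: file to check, $2: expected SHA256 hex digest
    local actual
    actual=$(sha256sum "$1" | awk '{print $1}')
    [[ "$actual" == "$2" ]]
}

# Typical usage before importing the VM image (file names assumed):
# verify_checksum NextcloudVM.ova "$(cat NextcloudVM.ova.sha256)" || exit 1
```

Note that a plain checksum only guards against corrupted downloads; unlike a GPG signature, it does not help if the server publishing the checksum is itself compromised.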
Thanks for the attempts so far. However, I unfortunately cannot go down the rabbit hole of doing it myself. See my comment from #1263 (comment):
Seems like it's possible but requires an extra step: https://wiki.debian.org/Creating%20signed%20GitHub%20releases
Refer to: