Uses local RPM build for "dev" and "staging" scenarios #587

Merged
conorsch merged 14 commits into main from 505-use-rpm-for-make-all on Sep 4, 2020

Conversation

conorsch
Contributor

@conorsch conorsch commented Jul 10, 2020

Status

Ready for review

Description of Changes

Fixes #538

Changes proposed in this pull request:

  • Using the "make dev" environment now uses RPMs for install/uninstall
  • Docker is now required in the dev env!

Hopefully this will make reasoning about problems like #505 a lot more accessible to developers.

Testing

Requesting testing for multiple scenarios, since early review showed some surprises.

Dev scenario

There's a catch here—you must run make clone twice in order to get started, because the first clone will only pull in the new docker logic, not run it.

make clone
make clone # [sic]
make clean # ensure everything removed cleanly
make dev
make test

Mess around with the clean/clone/dev/test combinations and see if you can get anything to break.
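One quick sanity check after make dev: confirm the locally built RPM actually landed in dom0. The package name below is the one shown in the test output later in this thread; rpm -qi is a standard query, nothing SecureDrop-specific.

rpm -qi securedrop-workstation-dom0-config   # confirm the dom0 config package is installed; Build Date/Host hint at a local build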

Staging scenario

Now let's check that the "staging" scenario continues to work well. Notably, the locally built RPM will be installed in staging.

make clone
make clone # [sic, just in case you didn't already run it twice for dev]
make clean # ensure everything removed cleanly
make staging

Interact with the environment, confirm that staging-specific functionality like shutdown-on-lid-close is working, then run make clean to clean up and confirm there are no errors.
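A couple of optional checks before cleaning up (the config.json path is the repo checkout in dom0, and the environment key is the same one make staging flips, per the README excerpt later in this thread):

grep '"environment"' config.json             # expect "staging"
rpm -q securedrop-workstation-dom0-config    # the locally built RPM should be installed here too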

Prod scenario

Since the prod flow uses a separate setup process (https://workstation.securedrop.org/en/stable/admin/install.html#download-and-install-securedrop-workstation), there shouldn't be any conflicts with the logic presented here. If you can think of some, please comment or directly edit this test plan!

Checklist

If you have made code changes

  • Linter (make flake8) passes in the development environment (this box may
    be left unchecked, as flake8 also runs in CI)

If you have made changes to the provisioning logic

  • All tests (make test) pass in dom0 of a Qubes install

  • This PR adds/removes files, and includes required updates to the packaging
    logic in MANIFEST.in and rpm-build/SPECS/securedrop-workstation-dom0-config.spec

@emkll emkll changed the base branch from master to main July 13, 2020 15:54
@conorsch conorsch force-pushed the 505-use-rpm-for-make-all branch from cf454cf to 7885cc3 Compare July 22, 2020 00:07
@conorsch conorsch requested review from emkll, kushaldas and rmol July 22, 2020 00:11
@conorsch conorsch marked this pull request as ready for review July 22, 2020 00:13
@conorsch
Contributor Author

Marking as ready-for-review. Rebased these changes and tested locally, and was pleased with the results, so it's ready for more eyes.

Contributor

@emkll emkll left a comment

Went through a preliminary review; functional testing worked as expected. A couple of thoughts are inline.

dom0/sd-dom0-files.sls
scripts/clean-salt (outdated)
# in the subsequent tarball, which is fetched to dom0.
function build-dom0-rpm() {
    printf "Building RPM on %s ...\n" "${dev_vm}"
    qvm-run -q "$dev_vm" "make -C $dev_dir dom0-rpm"
Contributor

Would it make sense here to bump the RPM version, to make sure that the locally built version is always higher than anything available from the repos?

Contributor Author

I'm not aware of any RPM tooling like dch for programmatically bumping the version. Ideally it'd be something like <current_version><alpha1><git_short_hash>, but absent tooling, simply defaulting to <current_version> makes the most sense to me.
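For reference, a rough sketch of how a git-derived suffix could be injected at build time. This is purely illustrative, not something this PR implements; the spec path is the one from the checklist above, and whether the resulting release string sorts above or below the repo-served packages would still need checking.

# Hypothetical local-build versioning: embed the git short hash in the dist tag
GIT_SHA="$(git rev-parse --short HEAD)"
rpmbuild -bb \
  --define "dist .dev.${GIT_SHA}" \
  rpm-build/SPECS/securedrop-workstation-dom0-config.spec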

emkll previously requested changes Aug 3, 2020
Contributor

@emkll emkll left a comment

Went through this again from a development-scenario perspective: the changes look good, are a significant improvement for developers, and bring more consistency across environments.

However, it looks like there might be some implications for the prod/staging scenarios (see inline). It might make sense to invoke prep-salt exclusively for the dev scenario (instead of using conditionals).

There's also a potential versioning issue: if the version of the local RPM is lower than the one served by the RPM repo, changes to the Salt files may be overwritten by the ones on the RPM servers. We could auto-update, but we can also just provide a docs update to unblock, since this will only affect developers.

Finally, the tests rely on the create_tarball function (https://github.com/freedomofpress/securedrop-workstation/pull/587/files#diff-3c3ec2225864916f85515009ebeeb428L24). Packaging the tests and reducing or eliminating the requirement of a full clone of the repository in dom0 may help reduce developer error (changes to Salt files in the dom0 securedrop-workstation folder will effectively be useless). I don't think this should be immediately addressed as part of this PR, but it should probably be tracked in a follow-up issue.

UPDATE: we will need to rebase and update references to the installer (securedrop-admin -> sdw-admin), since #596 has been merged.

Makefile
scripts/prep-salt (outdated)
@eloquence
Member

We should clarify the README as part of this PR. @conorsch, if you want, I can add a commit to take a stab at it once other comments are addressed.

@eloquence
Member

(Needs rebase after #596)

Conor Schaefer added 5 commits August 5, 2020 17:18
Builds the RPM in sd-dev VM, then fetches it to dom0 for local
installation via the make-clone action. Simplifies the provisioning
logic by trusting dnf to handle the add/remove of all config files.

Adjusts the docker build command, removing the flag for an interactive session.
In order to run the build via qvm-run from dom0, we must remove the flag;
otherwise `docker run` fails with:

  the input device is not a TTY

There are no prompts in the RPM build process.

Now we use the RPM to manage the presence/absence of all scripts and
configs, Salt or otherwise, so we no longer double-remove the files:
we trust the "dnf remove" action to handle it, regardless of
environment (dev, staging, or prod).

Adjusts the "make clean" target to reuse the local securedrop-admin
script for provisioning. Adds two new CLI flags to the script, both off
by default, to accommodate dev-scenario settings: --keep-template-rpm
(to avoid time spent redownloading) and --force (to avoid prompts). See
the sketch after this commit list for an example invocation.

Previously the script emitted helpful text but still exited zero, so when
used in chains like "make all && make test" it would try to continue
processing.
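A rough example of how the new flags from the "make clean" commit might be invoked. Only --keep-template-rpm and --force are confirmed by the commit message above; the --uninstall flag name is an assumption based on the uninstall code paths shown later in this thread.

# Hypothetical invocation from "make clean": uninstall without prompts,
# keeping the cached template RPM to avoid re-downloading it next time.
./scripts/sdw-admin.py --uninstall --force --keep-template-rpm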
@conorsch conorsch force-pushed the 505-use-rpm-for-make-all branch from 7885cc3 to 2a31b21 Compare August 6, 2020 00:20
Pointed out by @emkll during review: the "staging" and "prod" makefile
targets were including "prep-salt", but they really don't need to.
So, removing them, and renaming the target accordingly.
@conorsch
Contributor Author

conorsch commented Aug 6, 2020

Rebased to resolve conflicts, and implement a few requested changes. Ready for review!

@eloquence To your point about the README, thank you, I'd welcome fresh eyes on docs updates here.

@conorsch conorsch marked this pull request as draft August 6, 2020 18:09
@conorsch
Contributor Author

conorsch commented Aug 6, 2020

Downgraded to draft, to block merge until we've got the docs updated. Still ready for functional review.

scripts/sdw-admin.py (outdated)
@eloquence
Member

Per our discussion at standup, I've added a commit 9dc287a that updates the staging target to use a locally built RPM, just like the dev target. The intended remaining use of this target is during the development of RCs, and to test differences in behavior that are specific to staging (e.g., the power management settings) during development.

That commit also removes the prod target in its entirety, since it serves no obvious purpose at this point. I've tested make staging locally and it seems to work, and I've updated the README, but I haven't updated the test plan yet. @conorsch If this commit looks good to you, I can flesh out the test plan a bit.

While nightly RPMs are currently "temporarily" disabled, I would propose to block merge of this PR until freedomofpress/securedrop-builder#186 is resolved, as the current CircleCI configuration still suggests that we may want to re-enable nightly RPM builds at a later date, which would be unsafe to do in combination with this change.

@emkll emkll changed the title Uses RPM for "dev" scenario Uses local RPM build for "dev" and "staging" scenarios Aug 19, 2020
Contributor

@emkll emkll left a comment

Thanks @conorsch and @eloquence. I have tested both the staging and dev environments locally and left a few notes inline, mostly related to documentation. I think this is otherwise good to squash and promote for final review/merge.

README.md Outdated
@@ -192,9 +192,9 @@ In the Qubes Menu, navigate to `System Tools` and click on `Qubes Update`. Click

You can install the staging environment in two ways:

- If you have an up-to-date clone of this repo with a valid configuration in `dom0`, you can use the `make staging` target to provision a staging environment. Prior to provisioning, `make staging` will set your `config.json` environment to `staging`. As part of the provisioning, your package repository configuration will be updated to use the latest test release of the RPM package, and the latest nightlies of the Debian packages.
- If you have an up-to-date clone of this repo with a valid configuration in `dom0`, you can use the `make staging` target to provision a staging environment. Prior to provisioning, `make staging` will set your `config.json` environment to `staging`. As part of the provisioning, a locally built RPM will be installed in dom0, and your package repository configuration will be updated to use the latest test release of the RPM package, and the latest nightlies of the Debian packages (same as `make dev`).
Contributor

your package repository configuration will be updated to use the latest test release of the RPM package

This could be somewhat misleading. If the version of the locally built package is equal to or higher than the one mirrored by yum-test, the latest test release of the RPM package will not be used (but it will receive test package updates). We should specify using the prod install procedures to fully mirror the prod flow (or ensure the RPM is built off the latest tag/rc when cloning the repo in the dev VM). (Related to the comment in https://github.com/freedomofpress/securedrop-workstation/pull/587/files#r458880338.)

Contributor Author

Fair point. Rephrased as:

As part of the provisioning, a locally built RPM will be installed in dom0. The dom0 package repository configuration will be updated to install future test-only versions of the RPM package from the https://yum-test.securedrop.org repository, and Workstation VMs will receive the latest nightlies of the Debian packages (same as make dev).

README.md Outdated
@@ -313,7 +313,7 @@ make clone
make dev
```

-In the future, we plan on shipping a *SecureDrop Workstation* installer package as an RPM package in `dom0` to automatically update the salt provisioning logic.
+The `make clone` command will build a new version of the RPM package that contains the provisioning logic and copy it to `dom0`.
Contributor

We should state here that the RPM package is built in the dev environment. We should also include a note stating that Docker is now strictly required in the dev VM.

Member

This is stated in the setup instructions for sd-dev:

You must install Docker in that VM in order to build a development environment using the standard workflow.

But I agree a parenthetical here won't hurt as well, will add a commit to that effect.

Member

A bit more detail added in 902fc18


{% else %}

{% if d.environment != "dev" %}
Contributor

If I understand correctly, here we are explicitly excluding dev from installing the RPM from the yum repos, to avoid installing the latest from those repos should the locally built version be lower than the one on the server. If that's the case, it may be worth adding a comment here for future maintainers, as it is somewhat counter-intuitive.
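Something along these lines might do it (Salt/Jinja comment syntax; the wording is only a suggestion):

{# Skip the repo-based RPM install for "dev": the dev environment installs a
   locally built RPM via "make clone", and pulling from yum/yum-test here could
   override it whenever the local version is lower than the served one. #}
{% if d.environment != "dev" %}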

sudo qubesctl --show-output state.sls sd-workstation-buster-template
sudo qubesctl --show-output --skip-dom0 --targets sd-workstation-buster-template state.highstate

-sd-proxy: prep-salt ## Provisions SD Proxy VM
+sd-proxy: prep-dev ## Provisions SD Proxy VM
Contributor

The changes to the dev environment may have repercussions on these individual Makefile targets:

The prep-salt Makefile target would copy the local Salt files into /srv/salt/, whereas this updated prep-dev target will install the dom0 RPM. This means that any changes to the local files in the securedrop-workstation folder in dom0 will not be used.

If this is the case, adding a note to this effect in the dev docs could be helpful.

Contributor Author

If this is the case, adding a note to this effect in the dev docs could be helpful

That's definitely true. Editing the files in e.g. /srv/salt/ still works fine, but I wouldn't recommend either, given how easy it is to lose changes that way. Will clarify in the docs!

eloquence and others added 2 commits August 19, 2020 09:58
Suggested by @emkll as part of review. Explains:

  * dev changes should be made in sd-dev
  * 'make dev' tries to skip pulling rpm from repos
  * rpm upgrades in staging are expected but not guaranteed, depending
    on the version number
@conorsch
Contributor Author

Added commits addressing requests for further docs clarifications. Updated the test plan to include staging references. Marking ready for final review.

@conorsch conorsch marked this pull request as ready for review August 20, 2020 15:59
@zenmonkeykstop
Contributor

Running through the test scenarios on a fresh Qubes install (no previous dev env), make clean fails with the following errors due to /srv/salt/sd/config.json not being present:

local:
    Data failed to compile:
----------
    Rendering SLS 'base:sd-clean-all' failed: Jinja error: sd/config.json
Traceback (most recent call last):
  File "/usr/lib/python2.7/site-packages/salt/utils/templates.py", line 410, in render_jinja_tmpl
    output = template.render(**decoded_context)
  File "/usr/lib/python2.7/site-packages/jinja2/environment.py", line 989, in render
    return self.environment.handle_exception(exc_info, True)
  File "/usr/lib/python2.7/site-packages/jinja2/environment.py", line 754, in handle_exception
    reraise(exc_type, exc_value, tb)
  File "<template>", line 4, in top-level template code
  File "/usr/lib/python2.7/site-packages/salt/utils/jinja.py", line 145, in get_source
    raise TemplateNotFound(template)
TemplateNotFound: sd/config.json

; line 4

---
# -*- coding: utf-8 -*-
# vim: set syntax=yaml ts=2 sw=2 sts=2 et :

{% import_json "sd/config.json" as d %}    <======================

set-fedora-as-default-dispvm:
  cmd.run:
    - name: qvm-check fedora-31-dvm && qubes-prefs default_dispvm fedora-31-dvm || qubes-prefs default_dispvm ''

[...]
---
DOM0 configuration failed, not continuing
Traceback (most recent call last):
  File "./scripts/sdw-admin.py", line 128, in perform_uninstall
    subprocess.check_call(["sudo", "qubesctl", "state.sls", "sd-clean-all"])
  File "/usr/lib64/python3.5/subprocess.py", line 271, in check_call
    raise CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command '['sudo', 'qubesctl', 'state.sls', 'sd-clean-all']' returned non-zero exit status 1

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "./scripts/sdw-admin.py", line 187, in <module>
    main()
  File "./scripts/sdw-admin.py", line 177, in main
    perform_uninstall(keep_template_rpm=args.keep_template_rpm)
  File "./scripts/sdw-admin.py", line 146, in perform_uninstall
    raise SDWAdminException("Error during uninstall")
__main__.SDWAdminException: Error during uninstall
Makefile:100: recipe for target 'clean' failed
make: *** [clean] Error 1

@zenmonkeykstop
Contributor

Copying config.json into place allows make clean to finish, but make dev fails with the following error:

[...]
Installed:
  securedrop-workstation-dom0-config.noarch 0.4.0-1.fc25                                      

Complete!
Copying config secrets into place...
'config.json' -> '/usr/share/securedrop-workstation-dom0-config/config.json'
'sd-journalist.sec' -> '/usr/share/securedrop-workstation-dom0-config/sd-journalist.sec'
WARNING: v2 onion service configuration found.
Support for v2 onion services will be removed from SecureDrop in February 2021.
Migration guide: https://securedrop.org/v2-onion-eol/
make[1]: Leaving directory '/home/kog/securedrop-workstation'
sdw-admin --apply
Applying configuration...
WARNING: v2 onion service configuration found.
Support for v2 onion services will be removed from SecureDrop in February 2021.
Migration guide: https://securedrop.org/v2-onion-eol/
gpg: can't open `/usr/share/securedrop-workstation-dom0-config/sd-journalist.sec'
Traceback (most recent call last):
  File "/usr/bin/sdw-admin", line 187, in <module>
    main()
  File "/usr/bin/sdw-admin", line 161, in main
    validate_config(SCRIPTS_PATH)
  File "/usr/bin/sdw-admin", line 89, in validate_config
    validator = SDWConfigValidator(path)  # noqa: F841
  File "/usr/share/securedrop-workstation-dom0-config/scripts/validate_config.py", line 39, in __init__
    self.confirm_submission_privkey_file()
  File "/usr/share/securedrop-workstation-dom0-config/scripts/validate_config.py", line 106, in confirm_submission_privkey_file
    subprocess.check_call(gpg_cmd, stdout=subprocess.DEVNULL)
  File "/usr/lib64/python3.5/subprocess.py", line 271, in check_call
    raise CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command '['gpg', '/usr/share/securedrop-workstation-dom0-config/sd-journalist.sec']' returned non-zero exit status 2
Makefile:19: recipe for target 'dev' failed
make: *** [dev] Error 1

@conorsch
Contributor Author

conorsch commented Sep 1, 2020

@zenmonkeykstop You're right, there are a bunch of import "sd/config.json" calls in the Salt files. The "import" path evaluates to /srv/salt/sd/config.json, but in other contexts, such as the prod docs, we assume that /usr/share/securedrop-workstation-dom0-config/config.json is the proper path. We should definitely fix this, but it looks to my eye like that's a problem on the main branch, too.

@zenmonkeykstop
Contributor

config.json gets copied to /srv/salt/sd by the make dev/staging runs though, so the issue here IMO is more that make clean doesn't run cleanly when there isn't an existing install or the install broke before that point. If it's run before the RPM is installed then there won't be anything under /usr/share/securedrop... either.

@conorsch
Contributor Author

conorsch commented Sep 1, 2020

config.json gets copied to /srv/salt/sd by the make dev/staging runs though

  1. That's no longer true on this branch, which doesn't copy the files to /srv/salt/ anymore.
  2. In prod contexts, we tell admins to configure the secrets in /usr/share/, with no mention of /srv/salt/ at all.

Since the Salt state files require access to the JSON file to import, we should use the /usr/share/ path in all places as of this PR, which uses the RPM everywhere, even for make dev. I'll push up a commit making that change.

If it's run before the RPM is installed then there won't be anything under /usr/share/securedrop... either.

Unintuitive, but running 'make clean' will actually install the RPM and then remove everything, because the files slated for removal are installed by the RPM. That's not new in this PR; it's true even on the main branch now, where "prep-salt" runs as a dependency of make clean.
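In other words, the clean path on this branch looks roughly like this. The sd-clean-all state name is the one visible in the traceback above; the local RPM path is assumed.

# Sketch of what "make clean" effectively does:
sudo dnf install -y securedrop-workstation-dom0-config-*.rpm   # local file fetched from sd-dev; path assumed
sudo qubesctl state.sls sd-clean-all                           # the clean states are shipped by the RPM itself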

The salt states require secrets such as the config file and privkey to
be stored in `/srv/salt/sd/`, as the import lines in the state files
assume those locations. The sdw-admin install action copies those files
into place, but in the case of "make clean", the secrets won't be there.
Let's make sure they are, and make sure to remove them (as on main) via
the uninstall action.
@conorsch
Contributor Author

conorsch commented Sep 2, 2020

@zenmonkeykstop The latest commit now supports make clean even when no prior installation has been run. I did not move the import locations to /usr/share/ as suggested earlier, but rather re-added the copy-to-/srv/salt logic. There are two secrets, the privkey and the config.json file, and the privkey needs a "salt://" prefix (and therefore must be inside /srv/salt/), so it wasn't worth moving just one file. The sdw-admin "install" action copies the files into place in /srv/salt on each run, and both the sdw-admin "uninstall" and the make clean actions remove them from /srv/salt.
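Roughly, the install action now does something like the following before applying states (directory and filenames are the ones discussed in this thread; the actual copy logic lives in sdw-admin):

# Sketch of the copy-to-Salt-root step described above:
sudo mkdir -p /srv/salt/sd
sudo cp config.json sd-journalist.sec /srv/salt/sd/   # removed again by uninstall / make clean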

Was still used by sdw-admin, so we must keep it around, even if it's not
useful in the raw Makefile targets.
Contributor

@zenmonkeykstop zenmonkeykstop left a comment

Test plan for dev and staging is passing for me now. There is one unrelated test failure in make test due to that race condition with systemctl in sd-log, but otherwise this LGTM!

@conorsch
Contributor Author

conorsch commented Sep 4, 2020

There is one unrelated test failure in make test due to that race condition with systemctl in sd-log

That'd be #583. Thanks for the thorough review, @zenmonkeykstop!

@conorsch conorsch dismissed emkll’s stale review September 4, 2020 19:50

Subsequent review performed by @zenmonkeykstop

@conorsch conorsch merged commit 215108f into main Sep 4, 2020
conorsch pushed a commit that referenced this pull request Sep 11, 2020
Follow up to #587.

The `dnf install -y <local_file>` action will not reinstall a package
if the versions are the same. Since we expect package contents to
change, but not version strings, when running `make clone`, let's make
sure to uninstall the rpm package in dom0 so that the install action
always takes effect.
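The behavior being worked around, roughly (standard dnf semantics; the package name is the one from the test output above, and the local file path is assumed):

# dnf will not reinstall an already-installed package at the same version,
# so the follow-up removes it first to guarantee the fresh contents land:
sudo dnf remove -y securedrop-workstation-dom0-config || true
sudo dnf install -y securedrop-workstation-dom0-config-*.rpm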
@conorsch conorsch mentioned this pull request Sep 11, 2020
zenmonkeykstop pushed a commit that referenced this pull request Sep 14, 2020
* Uninstalls RPM in prep-dev script

Follow up to #587.
cfm pushed a commit that referenced this pull request Apr 1, 2024
Uses local RPM build for "dev" and "staging" scenarios
@legoktm legoktm deleted the 505-use-rpm-for-make-all branch May 28, 2024 15:25
Successfully merging this pull request may close these issues.

Use RPM to manage dom0 files in all environments
4 participants