Build ultimate viewing station interface #415
How does the exchange of files (.securedrop) work between the online and air-gapped Tails machines?
Current securedrop users need to do this already, but I don't think we have a best practice yet. Ideally it would use read-only media, like burning data to CDs, or SD cards or USB sticks with hardware-enforced write-protect switches (using a different SD card or USB stick for transferring files in each direction). But the lower-security method would just be to use a normal USB stick.
Copying-and-pasting comments from securedrop-dev: Chris Palmer wrote:
@garrettr replied:
I've started building the skeleton of this air-gapped user interface here, complete with a system for "installing" it in the Persistence volume: https://github.com/micahflee/securedrop-airgap -- we can move this to freedomofpress/securedrop-airgap at some point. This requires Tails 1.1 because it depends on wheezy packages (Tails 1.1 now has a release date of July 22). For now you can use the 1.1 beta to test this out/develop in.

I've also been thinking about journalist workflow and how to make it simpler. As it stands, journalists already need to "install" software in their networked Tails Persistence volume in order to add the HidServAuth line to torrc on boot. So why not make this software do everything we need, rather than just that? Once there's an air-gapped Tails desktop app that you import data into, all you have to do on the networked Tails is download (and upload) the latest changes. Then you can copy them to (or from) your air-gapped Tails and import them.

So how about this idea: there should also be another piece of software called securedrop-client, which runs on the networked Tails computer. Like securedrop-airgap, it's a GUI app. On first run it lets you set your .onion domain, your HidServAuth token, and your username (for basic auth for Google Auth). It can save this info in a config file. Basically, I don't see any reason to require the journalist to use a web browser, when we can build a simpler, better interface for downloading changes.

In order for this to work, the journalist.py web app needs a new endpoint for downloading updates. We also need a new library, included in both the securedrop project and the securedrop-airgap project, that handles creating these updates and saving them to a file, and also loading from a file and parsing back into a data structure. Which means we'll need to come up with a file format, and maybe a file format version (in case it changes in the future). To use the git metaphor above, if we want to call a "commit" an "update", then maybe the python module will be named to match. So to wrap it all up:
Also, this is all stuff I'm just dreaming up. As it stands securedrop is really easy to use for sources and still painful for journalists. I think 0.3 will help this a lot, but if we can make securedrop-airgap and securedrop-client it will be "easy" for journalists too, at least compared to what they currently have to do. Please add your own input. This might not be the best approach, but I think it's pretty good.
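A minimal sketch of what the shared update-file library described above could look like (saving updates to a file, loading them back, and carrying a format version in case it changes later); the module layout, field names, and file structure here are hypothetical, not taken from any actual securedrop code:

import json

FORMAT_VERSION = 1

def save_updates(updates, path):
    """Serialize a list of update dicts to a .securedrop file."""
    payload = {"format_version": FORMAT_VERSION, "updates": updates}
    with open(path, "w") as f:
        json.dump(payload, f)

def load_updates(path):
    """Parse a .securedrop file back into a list of update dicts."""
    with open(path) as f:
        payload = json.load(f)
    if payload.get("format_version") != FORMAT_VERSION:
        raise ValueError("unsupported .securedrop format version")
    return payload["updates"]

# Round-trip example with a made-up update record:
save_updates([{"id": 1, "kind": "reply", "body": "thanks"}], "net-to-airgap.securedrop")
assert load_updates("net-to-airgap.securedrop")[0]["kind"] == "reply"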
Micah, with this approach is there a reason to leave a copy of the existing updates on the server? From a journalist usability perspective I think it makes a lot of sense. Also, it might be a good idea in the airgapped journalist interface to also...
The reasons I can think of are: what about orgs with multiple journalists that check securedrop who are in different geographic locations? If JournoA downloads and deletes all of the updates from the server, then when JournoB logs in there won't be any updates to download. There also could be the case where it's normally JournoA's job to check securedrop, but for whatever reason they can't do it for a month, so JournoB takes over until JournoA is back.

And then finally there's the scary bit -- what if a journalist downloads the updates and then doesn't successfully copy them to the air-gap Tails? Like what if they burn the wrong file to CD and then shut down Tails? By the time they boot up Tails again the file is lost and it can't be redownloaded. Of course this can be mitigated by making securedrop-client always download files to something like...
I was thinking securedrop-airgap could be as large and featureful as we want, because we can add as much new information as we want (starring sources, giving sources custom aliases, adding notes to everything -- documents, messages, replies, etc.). When you export changes it will only export updates that are actionable by the server (posting replies, deleting things), and the rest of the data won't get exported. This is safe because it's all on an air-gapped computer, and since it's in Tails it's all sitting in an encrypted persistence volume.

But this also has the same problem -- if multiple journalists are using this, each journalist will only see their own extra notes. This part might be fine though -- maybe JournoB doesn't need notes from JournoA's source. But perhaps there can be a way of exporting from securedrop-airgap that also exports all this other stuff, so that file can be directly imported into another securedrop-airgap to keep them all in sync. (And JournoA and JournoB can use onionshare to send those updates to each other, so they don't need to exist on the securedrop server at all.)

But ok, tying it all together, how about this for a solution: when a journalist uses securedrop-client to download updates, it marks all of those updates on the server for deletion a week from now (sort of like the option to delete emails later in POP). This way the journalist (or another journalist) can still download them again before they get auto-deleted.

When exporting data from securedrop-airgap, there can be a normal export and a special checkbox to export all data (sort of like exporting a PGP keypair in Enigmail -- the default is just to export the public key, but you can check a box to also export the secret key). The normal export will only export replies to sources (and perhaps there can be a "delete immediately" option on imported updates that gets exported too). If you check the box, it exports all data, including submissions, replies, notes, labels, etc. This can be used to import into another securedrop-airgap, not to import into securedrop-client. (If you try importing it into securedrop-client, it could silently ignore everything except for replies and only send those to the server.)

This way, data doesn't sit on the securedrop server for more than a week after getting downloaded (and can optionally get deleted right away), and it's possible for journalists to send each other blocks of updates to keep themselves in sync.
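A minimal sketch of the two export modes described above, assuming updates are simple dicts held in memory; the record fields and function name are hypothetical, not from any actual securedrop code:

# Hypothetical update records; the "kind" values are illustrative only.
SERVER_ACTIONABLE = {"reply", "delete"}             # normal export keeps only these
PRIVATE_KINDS = {"note", "label", "star", "alias"}  # stays on the airgap by default

def export_updates(updates, include_all=False):
    """Filter updates for export.

    Normal export keeps only server-actionable updates (replies, deletions).
    With include_all=True (the "export all data" checkbox), everything is kept
    so the file can be imported into another securedrop-airgap.
    """
    if include_all:
        return list(updates)
    return [u for u in updates if u["kind"] in SERVER_ACTIONABLE]

# Example usage with made-up records:
updates = [
    {"id": 1, "kind": "reply", "source": "falcon", "body": "thanks"},
    {"id": 2, "kind": "note", "source": "falcon", "body": "seems credible"},
]
assert [u["id"] for u in export_updates(updates)] == [1]
assert len(export_updates(updates, include_all=True)) == 2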
By the way, the securedrop-airgap skeleton for a GUI app that's secretly a website is pretty much done. Here's the web page that it's rendering: https://github.com/micahflee/securedrop-airgap/blob/master/securedrop_airgap/templates/index.html And here's the server-side flask web app that's powering it: https://github.com/micahflee/securedrop-airgap/blob/master/securedrop_airgap/webapp.py
I generally think it's a great idea! I think it would be good to give journalists a way to write replies in a more secure environment and create tags and make notes in a way that never escapes the airgap, unless exported explicitly. I envision this basically making the hidden-service hosted web UI redundant, used only to sync between instances of the securedrop-client apps with the endpoints you've listed.

You were mentioning the problem of syncing private notes between airgaps. Since the airgapped machines already have the journalist keypair to decrypt messages, we can use this to sign exports from the airgap (in the case of public exports) and encrypt messages to the other airgap containing the same keypair (in the case of private exports). When you move the exports over to the second airgap, both can be verified and the private export can be decrypted. As we move to a system with multiple recipients, this workflow could be amended to sign with journalist A's private key to journalist B, having their public key available. This would even allow collaboration between journalists, not just sharing notes on a source but possibly even selectively sharing documents.

I think we'll have to deal with some problems regarding the canonical order of events, though. What if there are two submissions to the [...]? This problem only becomes worse when there's no server to make the final call, as is the case with the private exports. Which sequence of events should be staged? And what if the installed version of securedrop-airgap is different on different airgaps, and the way events are imported changes? What if the airgaps get out of sync because of this? I think we'd need to include with each update some kind of checksum of all data in the current state.

I don't think these problems are showstoppers, but I do think they have to be considered carefully before implementation. All in all though I think this is exciting and we should incorporate it into our roadmap!
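A minimal sketch of the state-checksum idea mentioned above, assuming the airgap's state can be serialized deterministically; the field names and serialization are hypothetical:

import hashlib
import json

def state_checksum(state):
    """Hash a canonical serialization of the whole airgap state.

    If two airgaps applied the same sequence of updates, their checksums
    match; a mismatch means events were applied in a different order or
    one side is missing updates.
    """
    canonical = json.dumps(state, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

# Each exported update could carry the checksum of the state it expects,
# so the importer can refuse to apply it against a diverged state.
state = {"sources": {"falcon": {"starred": True}}, "last_update_id": 42}
update = {"id": 43, "kind": "reply", "expected_state": state_checksum(state)}
assert update["expected_state"] == state_checksum(state)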
I agree. I think it makes sense to develop this in parallel to the current journalist interface. But once this contains all the functionality of the current journalist interface, we should completely strip out all of that code and just use this instead.

I also think you have some really good ideas about signatures. I think all exports should be encrypted and signed. The securedrop web app can have its own keypair. When you download updates, the file you download should be encrypted to the air-gapped key and signed with the web app key. When you export from securedrop-airgap to import into securedrop-client, the exported file should be encrypted to the web app key and signed with the air-gapped key. When you export from securedrop-airgap to import into another securedrop-airgap, it should be both encrypted to and signed with the air-gapped key. Software can refuse to import updates that aren't properly signed. This is especially useful because if removable media, like a burned CD, a USB stick, or an SD card, with updates on it gets compromised, it's totally worthless without also compromising either the air-gapped key or the web app key (otherwise there would be metadata leakage).

Additionally, I think it's actually safe to use TOFU to import the web app key into securedrop-airgap. Each time you download updates using securedrop-client, the web app can include a copy of its public key. If securedrop-airgap doesn't know what the web app key is, it can just store the first web app key that gets imported. From that point on it can refuse to import updates that are signed with a different key. The installation documentation can just say that after installing all of the securedrop components, download and import updates the first time to initialize securedrop-airgap.
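A minimal sketch of one leg of the scheme above (server-to-airgap), using the python-gnupg library; the key fingerprints, keyring path, and file names are hypothetical placeholders, and passphrase handling for the signing and decryption keys is omitted for brevity:

import gnupg

gpg = gnupg.GPG(gnupghome="/home/amnesia/Persistent/gnupg")  # hypothetical keyring location

AIRGAP_KEY_FPR = "AAAA0000PLACEHOLDER"  # air-gapped journalist key (recipient)
WEBAPP_KEY_FPR = "BBBB0000PLACEHOLDER"  # web app signing key (TOFU-pinned on the airgap)

# On the server: encrypt the update bundle to the air-gapped key and
# sign it with the web app key before offering it for download.
with open("net-to-airgap.securedrop.clear", "rb") as f:
    result = gpg.encrypt_file(
        f,
        recipients=[AIRGAP_KEY_FPR],
        sign=WEBAPP_KEY_FPR,
        output="net-to-airgap.securedrop",
    )
assert result.ok, result.status

# On the airgap: decrypt and check the signature; refuse the import if the
# signer is not the pinned web app key.
with open("net-to-airgap.securedrop", "rb") as f:
    plaintext = gpg.decrypt_file(f)
if not plaintext.ok or plaintext.fingerprint != WEBAPP_KEY_FPR:
    raise ValueError("refusing to import: bad encryption or unexpected signing key")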
This is now being actively worked on in the Qt client repo: https://github.com/freedomofpress/securedrop-client
I think we should build a GUI program to be installed in the Tails persistent volume that holds the journalist key. Basically, my hope is to move most of the work that journalists do from their internet-connected Tails to their air-gapped viewing station Tails. The journalist workflow can change to this:

- On the networked Tails, download all of the latest changes to a file, net-to-airgap.securedrop
- Copy net-to-airgap.securedrop to the air-gapped Tails and import it in the app, which will load all of the latest changes
- Do work in the app, then export the results to airgap-to-net.securedrop
- Copy airgap-to-net.securedrop back to the networked Tails and upload it to post all replies to sources

If we do this, we get some awesome benefits:
I've recently been playing with pywebkitgtk to build a GUI for onionshare using HTML, CSS, and JavaScript, while still being a standalone Python program. Basically, you create a GTK window, put a WebKit webview in it, and then render an HTML page. This guide was really helpful.
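A minimal sketch of that pywebkitgtk approach: a GTK window hosting a WebKit view that renders a local HTML page. This assumes the PyGTK 2 / python-webkit bindings that were current at the time; the window title and HTML path are hypothetical:

import gtk
import webkit

# Create the top-level GTK window.
window = gtk.Window()
window.set_title("securedrop-airgap")
window.set_default_size(800, 600)
window.connect("destroy", gtk.main_quit)

# Embed a WebKit view and point it at a local HTML page (path is made up).
view = webkit.WebView()
view.open("file:///home/amnesia/Persistent/securedrop-airgap/index.html")

scroller = gtk.ScrolledWindow()
scroller.add(view)
window.add(scroller)

window.show_all()
gtk.main()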
I think by building it as a web app that runs as a program we will have a lot of flexibility, and can make it work really well and look really polished. Optionally we can also run a web server with flask behind it to serve the pages and use a sqlite database (in onionshare it just loads a plain html file, with no web server).
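A minimal sketch of that optional Flask-plus-SQLite backend, bound to localhost so the embedded webview is its only client; the database path, table, and template name are hypothetical:

import sqlite3
from flask import Flask, g, render_template

DB_PATH = "/home/amnesia/Persistent/securedrop-airgap/data.db"  # hypothetical location

app = Flask(__name__)

def get_db():
    # One SQLite connection per request context.
    if "db" not in g:
        g.db = sqlite3.connect(DB_PATH)
    return g.db

@app.teardown_appcontext
def close_db(exc):
    db = g.pop("db", None)
    if db is not None:
        db.close()

@app.route("/")
def index():
    # Hypothetical schema: a "sources" table with one designation per row.
    sources = get_db().execute("SELECT designation FROM sources").fetchall()
    return render_template("index.html", sources=sources)

if __name__ == "__main__":
    # Localhost only: nothing here should ever listen on a network interface.
    app.run(host="127.0.0.1", port=5000)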
In terms of downloading changes on the internet computer to move to the air-gap viewing station and vice versa, we obviously need to only download diffs and not everything each time. There are various ways we could go about doing this, but I think we can think of the current state of a securedrop instance sort of like a git repo, and commits would happen each time a source is created, a doc or message is submitted, a reply is sent, or something is deleted. Each of these events happens chronologically, and they can have unique ids.
The securedrop server can keep track of which "commit" the journalist's air-gap is currently on. When they click the download changes button, it can bundle all "commits" from when they last downloaded until the latest commit, and serve them all together. And a similar thing can happen in the other direction, when the journalist exports from the air-gapped GUI app. If there are multiple journalists accessing the same securedrop, merges can be handled automatically. This stuff can all happen under the hood -- the only thing journalists should need to know is that they need to copy latest changes to their air-gapped computer, do some work, then copy latest changes back to their internet computer.
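A minimal sketch of that "bundle everything since the last commit" idea, assuming events carry monotonically increasing ids and the server tracks each journalist's position separately; the stores, field names, and function name are all hypothetical:

import json

# Hypothetical in-memory stores; the real thing would live in the server's database.
EVENTS = [
    {"id": 1, "kind": "source_created", "source": "falcon"},
    {"id": 2, "kind": "message_submitted", "source": "falcon", "body": "hello"},
    {"id": 3, "kind": "reply_sent", "source": "falcon", "body": "thanks"},
]
CURSORS = {"journo_a": 1}  # last event id each journalist's airgap has imported

def bundle_changes(journalist):
    """Collect every event after the journalist's cursor into one download."""
    last_seen = CURSORS.get(journalist, 0)
    pending = [e for e in EVENTS if e["id"] > last_seen]
    if pending:
        CURSORS[journalist] = pending[-1]["id"]
    return json.dumps({"events": pending})

# journo_a has already seen event 1, so only events 2 and 3 are bundled.
print(bundle_changes("journo_a"))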
I think this will be a huge long-term project, like maybe part of the 1.0 roadmap. Thoughts?