
Describe when features should be limited to secure contexts. #75

Closed · wants to merge 15 commits

Conversation

@dbaron (Member) commented Aug 23, 2017

This pull request is intended to fix #32.

I expect this text to be somewhat controversial and need a good bit of review and polishing. However, it seemed like a good way to start would be to write something down (thanks to @mikewest for the reminder), and we can try to make progress from there.

I'd also note that the place I put the new section in the document didn't feel particularly obvious, and is probably worth thinking about during the review.

I'll try to remember to address feedback as additional commits, with the intention of squashing the later commits in at the end.

@dbaron self-assigned this Aug 23, 2017
@annevk (Member) left a comment:

I think an additional factor of "Is it trivial to limit the feature to secure contexts?" might be useful. Basically, if it's easy we should just do it. If it's not easy, the other considerations would still apply, of course.

index.bs Outdated

When the new feature is defined in
<a href="https://heycam.github.io/webidl/">WebIDL</a>,
specification authors can limit a feature to secure contexts
@annevk (Member):

can or should?

@dbaron (Author):

I think this one actually meant can; this was a description of facts, not a conformance requirement.
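For context, the mechanism under discussion is WebIDL's [SecureContext] extended attribute, which hides an interface from the global object of non-secure contexts. A minimal sketch (the interface and member names below are hypothetical, used only for illustration):

```webidl
// Hypothetical interface; [SecureContext] is the real WebIDL extended
// attribute that makes the interface unavailable in non-secure contexts.
[SecureContext, Exposed=Window]
interface ShinyNewFeature {
  Promise<undefined> activate();
};
```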

index.bs Outdated
Similar ways of marking features as limited to secure contexts should be added
to other points where the Web platform is extended over time
(for example, the definition of a new CSS property).
However, for some times of extension points (e.g., new DOM events),
@annevk (Member):

I can't parse this sentence.

Reply:

s/times/types/ ?

index.bs Outdated
(for example, the definition of a new CSS property).
However, for some times of extension points (e.g., new DOM events),
limitation to secure contexts should just
be defined in normative prose in the specification.
@annevk (Member):

If the new event comes with a new interface it's quite easy to restrict it though. Maybe you should clarify this by talking about "dispatching an event" instead.

index.bs Outdated
:: If a feature depends on
the expectations of authentication, integrity, or confidentiality
that are met only in secure contexts,
then it should be limited to secure contexts,
@annevk (Member):

must?

@mikewest left a comment:

I agree with Anne's general point that being more aggressive about the recommendations would be helpful. My goal inside Chrome is to reverse the default assumption, such that features generally ought to be restricted to secure contexts, and access to a feature over non-secure connections is the exception that needs to be justified.

I'd love to see this reformulated along those lines, as the TAG's opinions on the topic help me make the case internally that this isn't just crazy ol' Mike's opinion. :)

index.bs Outdated
First, it helps encourage Web content and applications
to migrate to secure contexts.
Second, it can restrict new APIs where authentication, integrity, or confidentiality
are important to prevent substantial increases to the privacy or security risks of using the Web.

@mikewest:

I would suggest reversing these. At least from Chrome's perspective, the latter has been the overriding concern internally.

Reply (Member):

But the first is the main goal. If you make the second the main goal, it's easier for folks to weasel their way into an exception.

@dbaron (Author) commented Aug 25, 2017

OK, I think I've addressed the feedback so far. I think it's worth another round of review at this point.

I worry a little bit that I may have come down a little too hard in terms of requiring that all parts of a new feature be hidden. Maybe it's OK to just hide the major pieces (and primary detection points) so that the feature is neither usable nor detectable as present. But it wasn't obvious to me how to fix that in my current wording...

@annevk (Member) left a comment:

LGTM % nit

index.bs Outdated
since sending untrusted data to a USB device could damage that device
or compromise computers that the device connects to.

Specification authors can most features defined in
@annevk (Member):

can limit*

@travisleithead (Contributor):

I also think the intro is too heavy-handed in requiring that all new features be limited to secure contexts. I think the subsequent sections are actually pretty good, outlining some principles that help shape the decision in a more nuanced way (and they tend to disagree with the "all features" stance laid down before).

Perhaps this could be softened by suggesting that a feature should be considered for Secure Context by default given the principles [described after that].

From my implementer's hat, I'm particularly sensitive to plumbing yet another mode through the platform (speaking as a veteran of IE's document modes). We already have quirks/almost-standards/standards mode, and now we'd have Secure Context as well... For APIs exposed to script, I'm OK with having this mode because the "switch" is processed once (when the type system is initialized for a given script engine) and never consulted again. Plumbing the switch into the formatting/layout system or event system is far less clean and has performance overhead.

Applying this to CSS also seems a little questionable to me.

I understand the desire to put new Houdini features behind secure contexts. Those features tend to rely on JavaScript APIs for initialization. If the JS APIs are put behind secure contexts, then surely that cuts off access to the related CSS properties?

Another example: for new conceptual CSS layout types, would we really want to prevent non-HTTPS sites from adopting a new kind of layout? A new layout seems to have no legitimate ties to "security" except that we want to use it as "bait" to switch the web to HTTPS...

Must we recommend that even new CSS properties should have secure context applied?

There may be a similar argument for events, though events can provide output that might leak sensitive information, so this argument seems weaker to me.

@annevk (Member) commented Aug 30, 2017

> Must we recommend that even new CSS properties should have secure context applied?

If we can, why not?

What's the rationale for continued support for insecure contexts?

@cynthia (Member) commented Sep 13, 2017

> What's the rationale for continued support for insecure contexts?

While a bit of a corner case, internal or isolated network services, in-development sites, and prototypes come to mind. Working on secure-context-only features using a local server isn't that user-friendly, since (AFAIK, correct me if I am wrong) you can't use ACME to get a free certificate.

(Wonder what kind of rules need to be bent to get a CA to issue a certificate that only works on localhost origins..)

@plinss (Member) commented Sep 13, 2017 via email

@mikewest:

> Must we recommend that even new CSS properties should have secure context applied?
>
> If we can, why not?
>
> What's the rationale for continued support for insecure contexts?

FWIW, I'm with Anne. If we can reasonably limit a given feature, we should.

At a minimum, I believe that stance should be our default. We can evaluate arguments on a case-by-case basis when that stance leads to results we're unhappy with. But folks should expect to have to make those arguments.

> That all said, I believe localhost is (or will soon be) considered a secure context without HTTPS.

http://localhost/ will be considered a secure context in the spec as soon as we can convince IETF folks to accept https://tools.ietf.org/html/draft-west-let-localhost-be-localhost. Chrome's out ahead of this a little bit (https://groups.google.com/a/chromium.org/d/msg/blink-dev/RC9dSw-O3fE/E3_0XaT0BAAJ), and I hope other vendors will do the same.

> You can also generate a self-signed certificate with any common/alt name you want and tell your browser to trust it for local testing. So this shouldn't be a burden for developers.

Exactly this. Browsers should also do a better job of allowing developers to treat a given origin as "secure enough" for development purposes. Chrome has command-line flags, but we should really embed it into devtools somehow.
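As a concrete sketch of that self-signed-certificate workflow (hostname, filenames, and validity period below are illustrative; the resulting certificate still has to be imported into the OS or browser trust store before the context counts as secure):

```shell
# Generate a throwaway self-signed certificate for local development.
# "localhost" is the development hostname; substitute your own.
openssl req -x509 -newkey rsa:2048 -nodes \
  -subj "/CN=localhost" \
  -addext "subjectAltName=DNS:localhost" \
  -keyout key.pem -out cert.pem -days 30

# Inspect the result; the browser must still be told to trust it.
openssl x509 -in cert.pem -noout -subject
```

(The -addext flag requires OpenSSL 1.1.1 or later.)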

@martinthomson (Contributor):

@mikewest, there isn't a lot of visibility of draft-west-let-localhost-be-localhost. I think we're somewhat stalled on that. I don't know where you were discussing it. There are alternatives on the browser side, though. Maybe we should discuss that some more; I have some ideas here that I need to look into first.

@mikewest:

@martinthomson: Just finished a call for adoption in DNSOP, which I think was successful though the chairs haven't yet confirmed their view. The main hangup is the fork in https://tools.ietf.org/html/draft-west-let-localhost-be-localhost-06#section-4.2. Strong opinions on both sides. shrug

@martinthomson (Contributor):

Thanks for the pointer. I predict that it will be adopted but that you will eventually regret ever attempting this. It will be an RFC worth having though. Congratudolences.

@triblondon (Contributor):

At first I thought that we in the TAG were being congratudoliated, but it was @mikewest that had that honour.

- require justification even for not-new-feature
- say that the TAG can be consulted
- expand on feature detection equivalence with unimplemented features
- restructure a bit in order to do both of the above
@mikewest left a comment:

Some more high-level comments. Thanks for following up on this @dbaron!

index.bs Outdated
is discouraged and requires strong justification.
The TAG is interested in hearing about and discussing cases
where it is unclear whether exposing the capability
in non-secure contexts is justifiable.
@mikewest:

This sentence is odd, as it directly undercuts the rest of the paragraph. Have the courage of your convictions!

If y'all feel the need to weaken the claim that "New capabilities added to the Web should be available only in secure context", I'd suggest doing so weakly. Perhaps "There may be reasonable justification for exposing a given capability in non-secure contexts; the TAG is interested in hearing about those edge cases, and working to resolve them."?

@dbaron (Author):

How about just:

The TAG is interested in hearing about and working to resolve any cases where exposure in non-secure contexts is being seriously considered.

@dbaron (Author):

Or, simpler:

The TAG is interested in hearing about cases
where exposing new features in non-secure contexts is being considered.

(I'm pulling the "working to resolve" because I suspect that much of what we hear about might qualify as "not a new feature".)

(for example, the definition of a new CSS property).
However, for some types of extension points (e.g., dispatching an event),
limitation to secure contexts should just
be defined in normative prose in the specification.
@mikewest:

Nit: I'd suggest that you move this paragraph up above the "And here are some exceptions" bit. Then the structure would be something like:

  1. Y'all should do this thing.
  2. Here's why you should do this thing.
  3. Here's how you should do this thing.
  4. And if you really don't want to do this thing, here's some things to think about.

That seems like a clearer message to me.

@dbaron (Author):

I reordered the paragraphs.

it is not possible for developers to detect whether a feature is present,
limiting the feature to secure contexts
might cause problems
for libraries that may be used in either secure or non-secure contexts.
@mikewest:

This paragraph seems like a distinct design principle ("Thou shalt enable feature detection.") that you could discuss at length elsewhere, and reference here.

@dbaron (Author):

I moved this into #82 and revised the text.
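To illustrate the feature-detection point being discussed: an API hidden via [SecureContext] is simply absent from the global object of a non-secure context, so ordinary property-presence detection keeps working in both kinds of context. A sketch, where `newFeature` is a hypothetical API name and the globals are simulated stand-ins for `window`:

```javascript
// A [SecureContext]-limited API never appears on the global object of a
// non-secure context, so a library can probe for it in the usual way.
function featureAvailable(globalObj, name) {
  return name in globalObj;
}

// Simulated globals for illustration; a real page would pass `window`.
const secureGlobal = { isSecureContext: true, newFeature: () => {} };
const nonSecureGlobal = { isSecureContext: false };

console.log(featureAvailable(secureGlobal, "newFeature"));    // true
console.log(featureAvailable(nonSecureGlobal, "newFeature")); // false
```

From the library's point of view, "limited to secure contexts" is indistinguishable from "not implemented", which is the equivalence the thread is after.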

developer confusion about where the boundaries are.
We also don't want to increase
the complexity of implementations of Web technology
by requiring tests for secure contexts in too many *types* of places.
@mikewest:

I don't really understand this claim. Can you help me out? What "types of places" do you mean? The example below didn't help me (but I'm also not really a CSS guy, so the distinction between the difficulty of detecting a new property vs new syntax isn't clear to me... it seems like the former would be easier, though?). :(

@dbaron (Author):

What I mean by this is that for [SecureContext] annotations in WebIDL, the annotation presumably gets stored as a bitflag or similar, and gets checked in a small set of places that implement [SecureContext]. Likewise for CSS properties, engines presumably have a set of data about each property, to which a secure-contexts-only bit could be added, and likewise tested in a small number of places. But it seems preferable to avoid littering IsSecureContext() tests through the CSS parser (or other language parsers), and I think this preference likely aligns with the previous justification.

@mikewest:

For WebIDL, Chrome checks when generating bindings for a given context (e.g. everything is wrapped up in the exposed checks). For deprecations or places where [SecureContext] isn't relevant, we do inline checks at the entry point to the API (e.g. https://cs.chromium.org/chromium/src/third_party/WebKit/Source/modules/geolocation/Geolocation.cpp?rcl=2bd5f03512bcb0b0632366109612ea4e9c4b7ce2&l=220).

I think I agree with the thrust of your comments here, but I still don't really understand what this paragraph is telling feature designers. Does it boil down to "Use [SecureContext] when possible?"?

@dbaron (Author):

I think it boils down to "use [SecureContext] and equivalent things for other languages like CSS", and maybe be a little more hesitant about secure context restrictions that would need to be in other places.

Reply:

Can we make it a requirement for specs to have web platform tests so that all browsers are consistent in how these features are hidden/fail?

Such that "For deprecations or places where [SecureContext] isn't relevant, we do inline checks at the entry point to the API" behaves consistently?

index.bs Outdated
the expectations of authentication, integrity, or confidentiality
that are met only in secure contexts,
then it must be limited to secure contexts,
even if the other factors above could justify exposing it in non-secure contexts.
@mikewest:

I don't see it as the feature depending on authentication, integrity, or confidentiality, but instead the feature posing some risk to user privacy or security which is mitigated only by requiring authentication, integrity, and confidentiality. I mean, at some level, all features depend on the page's integrity, right? :)

WDYT about something like "If a feature poses a risk to user privacy or security which can be mitigated by requiring authentication, integrity, and confidentiality, the feature must be limited to secure contexts, ..."?

@dbaron (Author):

To avoid the "poses a risk" in your wording that makes it sound like the feature is problematic, I think I'm going to try:

If a feature would pose a risk to user privacy or security without the authentication, integrity, or confidentiality that is present only in secure contexts, then the feature must be limited to secure contexts, ...

@mikewest:

@dbaron: Friendly ping. We're having some conversations about this internally in Chrome, and a clear position statement from the TAG would be helpful in that discussion. :) /cc @slightlyoff

Anything I can do to help out?

@dbaron (Author) commented Oct 10, 2017

Yeah, I managed to miss the github email from your review comments 6 days ago. I'd been hoping to (a) polish the text I had so far, which was a bit rough, (b) go through the blink-api-owners thread and now also (c) go through your comments above. Not going to happen today, but hopefully sometime later this week.

dbaron added a commit to dbaron/design-principles that referenced this pull request Oct 13, 2017
This was originally part of w3ctag#75, but it seemed worth splitting out both
into a separate section and a separate pull request.
@hober (Contributor) commented Jan 16, 2018

I don't support limiting features to secure contexts solely as a carrot to encourage HTTPS adoption. Limiting new features that have privacy or security implications to secure contexts is sensible of course, but absent such reasons the implementation and authoring costs of fragmenting the platform generally outweigh the benefit of paternalism here. I'm glad @dbaron's planning to expand the exception around implementation complexity; I would go farther and carve out a similar exception for authoring complexity (again, when the feature does not have privacy or security concerns).

@pinobatch:
And in addition to "implementation complexity" for web browser developers, another aspect of "implementation complexity" is complexity for server owners. It's one thing for the operator of a web server reachable through the Internet to build a secure context. It's a bit more difficult for, say, the operator of a consumer-grade router, printer, or network-attached storage (NAS) device on a private home LAN. I estimate that it would include at least the annual cost of a real domain name, plus the annual cost of a domain-validated certificate should the single point of failure that is Let's Encrypt go under, plus one or more For Dummies books.

@othermaciej:
@hober's comment above is the general consensus at Apple. We think features should be restricted to secure contexts only when there is a privacy or security reason to do so for that specific feature. And it definitely should not be done for features where it would require constantly checking the isSecure bit during parsing of languages like CSS, WebAssembly or JavaScript.

While we agree with the goal of getting more of the web onto HTTPS, we don't think forking the web platform is an acceptable cost for doing so.

@natewiebe13:
Provided localhost is considered a secure context, making secure contexts a requirement for new features is a welcome change. Working locally, you can still prototype features without having to set up a certificate. Once you deploy the code to a server, there should be no problem, as setting up encryption using certbot is very simple. I would argue that consumer-grade devices running their own web servers aren't a cause for concern, as this targets new features, not ones that have already shipped. For example, most of these devices use very outdated techniques since they are trying to support a large number of browsers (often as old as IE9). None I've seen are using flexbox, CSS grid, or any web technology developed within the past few years.

Lastly, certificates are essentially required for HTTP/2, so if we can get the web on HTTPS, it will make the transition to HTTP/2 even easier.

@othermaciej:
Secure Contexts says UAs MAY treat localhost as a secure context only if they can guarantee it will only ever resolve to a loopback address (and are in any case not required to). https://w3c.github.io/webappsec-secure-contexts/#localhost

@pinobatch:
@natewiebe13 It's not always practical to test on localhost on mobile.

@littledan (Contributor):

I'm not sure how or whether this recommendation should be applied to programming languages like JavaScript and WebAssembly. TC39 has been trying to avoid making more JavaScript language modes with our "1JS" policy. Just from a parsing perspective, for example, adding another parameter to the grammar with new constructs banned adds significant complexity to the language definition, implementations and the testing matrix.

JavaScript library functions have not added I/O capabilities. The specification is currently not organized to give some global objects some library capabilities and others not; all global objects get all of the library. A change to this policy is possible, but it might relate to the in-development Realms specification (cc @erights). The most prominently security-relevant recent feature is SharedArrayBuffer, but the TC39 decision (taking into account apparent cross-browser consensus) has been to not remove SharedArrayBuffer from the ECMAScript specification.

@annevk (Member) commented Feb 5, 2018

> TC39 has been trying to avoid making more JavaScript language modes with our "1JS" policy.

The difference is that this mode will go away long term, whereas class/module/non-strict/strict are likely to stick around.

@littledan (Contributor):

@annevk Many also have the goal that most or all JS code will be able to transition to modules and strict mode in the future.

@annevk (Member) commented Feb 5, 2018

Most seems reasonable, but all seems far less realistic given event handlers, legacy scripts, etc.

@littledan (Contributor):

@annevk I'm not going to argue about whether non-secure contexts will go away, but, just saying this recommendation could be taken to affect JavaScript significantly in a way we're not currently working towards, or alternatively, it could be interpreted to be out of scope.

Is anyone who is working on this policy interested in presenting it to TC39? (or at least discuss it in a bug on the ECMA-262 repo). Although the next meeting might be a little hard for some to attend physically (it's in London), it's possible to call into a VC. If you're associated with the W3C but not Ecma, it's possible to attend as a liaison or invited expert, just a matter of moving a few papers.

@littledan (Contributor):

@annevk To clarify, for JavaScript, do you think this policy should apply to things in TC39's ECMAScript standard itself, or only to Web APIs defined in other specifications on top of it?

@littledan (Contributor):

I asked TC39 if they would like to adopt this policy, and so far the response has been widespread skepticism.

@daurnimator:
There is a good discussion around this issue in this Hacker News thread: https://news.ycombinator.com/item?id=16337998

@pinobatch:
@natewiebe13 wrote:

> most of these devices are using very outdated techniques since they are trying to support a large number of browsers

Let me describe the NAS case in more detail. The Secure Contexts spec proposes requiring a secure context for the Fullscreen API to make phishing by spoofing the operating system UI more difficult. A NAS on a home LAN might use the Fullscreen API to let users view videos stored on its drive. What certificate should the NAS's web server use once that becomes the case?

@annevk (Member) commented Feb 19, 2018

@littledan I think it depends on the feature and the impl complexity, but if a major new feature like modules would come along today, that seems like something we should restrict.

@annevk (Member) commented Feb 19, 2018

@pinobatch:
@annevk Plex can pay for the DigiCert partnership with revenue from users who subscribe to Plex Pass ($5 per month, $40 per year, or $120 lifetime). Many developers of free software server applications have no analogous revenue source.

@annevk (Member) commented Feb 19, 2018

You could build equivalent infrastructure on top of Let's Encrypt, no?

@astoeckel commented Feb 19, 2018

Certainly, you could get a Let's Encrypt wildcard certificate for *.<USER>.domain, set up a DNS server that resolves <IP>.<USER>.domain to <IP>, and hand out the certificate's private key to the user. However, I think that this solution is mildly insane and still doesn't solve the overall problem:

  • This is significant overhead. We're talking about requiring infrastructure for every (free) software project that allows you to run a HTTP server on your local network. Why should a local server depend on any remote infrastructure? You may still want to use this software in ten years, when this infrastructure no longer exists.
  • If the IP-based certificate provider is a publicly available service, this approach would be less secure than just connecting to a local HTTPS server via its IP address. The certificate is not signifying the authenticity of the server/device you're connecting to, but merely that the certificate has been issued to a person having control over *.domain. Furthermore, you would be trusting a (potentially) third party in actually returning the correct IP in the DNS query. But hey, you got a "green lock symbol" in the browser, so it must be secure, right?
  • In the Let's Encrypt case the certificate needs to be renewed every three months. Believe it or not, there are regions in the world where no permanent access to the global internet exists; instead, access to information (think dumps of Wikipedia, etc.) is provided on a local network. There is absolutely no reason to artificially restrict these web applications to an old subset of JavaScript APIs by not providing a user-friendly way to trust a local HTTPS connection.

My proposal would be to only introduce secure contexts for new JavaScript features if at the same time there is a standardised way to get a "trust certificate on first use" mechanism (think SSH) in the browser, that does not scare users away. I've attached my humble attempt at a mockup of what something like that may look like in the UA (comparison of what it looks like right now, and what it may look like).

[mockup SVG attachment]

@dbaron (Author) commented Apr 6, 2018

After the face-to-face discussion yesterday, the TAG concluded we couldn't come to consensus on strong advice about limiting features to secure contexts. @slightlyoff drafted some text that we could have consensus on, which I've now merged with the text that was here. Given how long the history is here, I decided to create the new proposal as a separate pull request in #89 rather than continuing to revise this one.

@travisleithead (Contributor):

Closing, as this was overtaken by #89

@fantasai commented Apr 9, 2018

FWIW, I would like to throw my (relatively insubstantial) weight 100% behind @hober’s comment here. Authoring complexity matters, release-date–based modes are capricious, and CSS authors often do not have the influence over server configs to escape this trap, making it particularly vicious to impose on them.

Imho, W3C TAG should be recommending against policies like “all new Web platform features added after X date are HTTPS-only, because we want to have more HTTPS carrots”. Such is not a policy crafted in service of good technical architecture: it is a marketing project being implemented as technical architecture. I don't believe marketing is a good basis for making decisions about the architectural foundations of the Web platform, and in this case I do consider it harmful, for the various reasons described by others in this thread.

Successfully merging this pull request may close these issues.

When should [SecureContext] be used and why?