
Re-activate top-100 from #10538 #10545

Closed · 20 tasks done
cschanaj opened this issue Jun 21, 2017 · 12 comments

cschanaj (Collaborator) commented Jun 21, 2017

From #10536 and #10538: I cannot do all of this myself. Please mark this as Good Volunteer Work. Thanks!

top-100

Bisaloo (Collaborator) commented Jun 21, 2017

@Hainish, but ultimately, those lists are pulled from hstspreload.org, right? There is simply a delay before domains are included in the lists you mention.

> This form is used to submit domains for inclusion in Chrome's HTTP Strict Transport Security (HSTS) preload list. This is a list of sites that are hardcoded into Chrome as being HTTPS only.

> Most major browsers (Chrome, Firefox, Opera, Safari, IE 11 and Edge) also have HSTS preload lists based on the Chrome list. (See the HSTS compatibility matrix.)

That's how I understand it, at least. Please tell me if I am mistaken.
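
For reference, a quick way to see where a single domain stands is to ask hstspreload.org directly. The sketch below is only an illustration: the /api/v2/status endpoint and the fields it returns are assumptions based on the public site, not something discussed in this issue.

```python
# Sketch (assumed API): query hstspreload.org for a domain's preload status.
import json
import urllib.request

def preload_status(domain):
    """Fetch the preload status record for `domain` from hstspreload.org."""
    url = "https://hstspreload.org/api/v2/status?domain=" + domain
    with urllib.request.urlopen(url) as resp:
        return json.loads(resp.read().decode("utf-8"))

if __name__ == "__main__":
    info = preload_status("eff.org")
    # A "preloaded" status would mean the domain ships in Chrome's built-in list;
    # other browsers pick the entry up from that list with some delay.
    print(info.get("status"))
```

Even when a domain shows up as preloaded there, a given Firefox or Edge release may not carry the entry yet, which is the delay Hainish describes below.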

Hainish (Member) commented Jun 21, 2017

The browsers are based on the Chrome list, yes. But it can take a while before domains are actually included in the browsers, so we don't want to prematurely remove rules just because the canonical source lists their domains as preloaded, and leave those domains unprotected in the browsers.

Hainish (Member) commented Jun 21, 2017

Also, Mozilla has its own logic for removing preloaded domains; it changes often and is not documented very well. So we can't count on the Mozilla implementation mirroring the Chrome list very closely.

cschanaj (Collaborator, Author) commented Jun 22, 2017

I've just found out that Yahoo.xml was whitelisted in ruleset-coverage-whitelist.txt. The file contains 2.4k+ lines and involves dozens of complicated rules; I don't think that anyone can re-activate it before the next release without significant effort ...
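
To get a feel for how big that job is, one can count the targets and rules in the file. A rough sketch, assuming the usual ruleset location in a checkout (the path below may differ):

```python
# Rough size check for a large ruleset such as Yahoo.xml.
import xml.etree.ElementTree as ET

def ruleset_stats(path):
    """Count <target> and <rule> elements in an HTTPS Everywhere ruleset file."""
    root = ET.parse(path).getroot()
    return len(root.findall("target")), len(root.findall("rule"))

if __name__ == "__main__":
    # Path is an assumption; adjust to wherever Yahoo.xml lives in your checkout.
    targets, rules = ruleset_stats("src/chrome/content/rules/Yahoo.xml")
    print(f"{targets} targets, {rules} rules")
```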

Bisaloo (Collaborator) commented Jun 22, 2017

> I've just found out that Yahoo.xml was whitelisted in ruleset-coverage-whitelist.txt. The file contains 2k+ lines and involves dozens of complicated rules; I don't think that anyone can re-activate it before the next release without significant effort ...

Agreed. This is why we should keep different domains in separate rulesets. Updating behemoths such as Yahoo.xml is no fun, and in the meantime it leaves all of the domains they cover unprotected.
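
For contrast, a per-domain ruleset can be very small. The snippet below is only an illustration with a placeholder domain, not an actual ruleset from the repository; it parses a minimal ruleset to show the shape being argued for here.

```python
# Minimal single-domain ruleset in the HTTPS Everywhere XML format,
# embedded as a string and sanity-checked with ElementTree.
import xml.etree.ElementTree as ET

MINIMAL_RULESET = """\
<ruleset name="Example.com">
    <target host="example.com" />
    <target host="www.example.com" />
    <rule from="^http:" to="https:" />
</ruleset>
"""

root = ET.fromstring(MINIMAL_RULESET)
assert root.tag == "ruleset"
print(root.get("name"), "->", [(child.tag, dict(child.attrib)) for child in root])
```

A file like this can be fixed or temporarily disabled on its own, without taking dozens of unrelated subdomains offline with it.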

cschanaj (Collaborator, Author) commented Jun 22, 2017

If nobody can re-activate Yahoo.xml before the next release, I suggest reverting the changes from 9bf49cb to Yahoo.xml and whitelisting it again. This gives us time to rewrite the ruleset while keeping the domains protected.

Remark: see the original Yahoo.xml prior to the comprehensive-fetch change.

Hainish (Member) commented Jun 22, 2017

It seems like at the very least we should remove the problematic domains flagged by the fetch test, even if we don't have time to go through the entire rule and simplify it.
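
A minimal version of that narrower fix could look like the sketch below: strip only the <target> entries whose hosts the fetch test flagged and leave the rest of the ruleset active. The flagged-host set and paths are placeholders, and rewriting the file this way would not preserve comments or formatting, so treat it purely as an illustration.

```python
# Drop only the flagged <target> hosts from a ruleset, keeping everything else.
import xml.etree.ElementTree as ET

FLAGGED_HOSTS = {"broken.example.com"}  # placeholder: hosts flagged by the fetch test

def drop_flagged_targets(in_path, out_path, flagged):
    """Remove <target> elements whose host is in `flagged`, then rewrite the file."""
    tree = ET.parse(in_path)
    root = tree.getroot()
    for target in list(root.findall("target")):
        if target.get("host") in flagged:
            root.remove(target)
    tree.write(out_path, encoding="utf-8")

if __name__ == "__main__":
    # Paths are placeholders for wherever the ruleset lives in a checkout.
    drop_flagged_targets("Yahoo.xml", "Yahoo.xml", FLAGGED_HOSTS)
```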

Hainish (Member) commented Jul 5, 2017

Closing, since this seems to be done now.

Hainish closed this as completed Jul 5, 2017

J0WI (Contributor) commented Jul 6, 2017

Thanks!
