Prevent Whoops from being exposed on non-development environment #604

Closed
kktsvetkov opened this issue Nov 11, 2018 · 6 comments
Comments

@kktsvetkov

TL;DR
Don't do anything with PrettyPageHandler if you are on a real public IP address or if the user-agent is a "bot".

Longer Version:

You can't fix stupid, but you can make it harder for careless developers to shoot themselves in the foot.

Earlier today I was commenting on the dumping arguments change, and the example I gave I found by googling for "Whoops! There was an error". You've probably seen all of that already.

Exposing your project's exceptions is bad, and it is even worse if the Whoops page shares sensitive data such as credentials and keys/tokens. I was asking whether the blacklist feature must cover the arguments dump as well. @jonasdt commented that Whoops is meant to be used for development environments only anyway, and I agree -- it's the developers' responsibility to set it up correctly. Nevertheless, Google is full of crawled Whoops pages.

My suggestion is to add two extra checks that will block the exception rendering:

  1. Allow rendering to proceed if SERVER_ADDR is a private IP address (10.*, 172.16.*, 192.168.*); deny it if SERVER_ADDR is a public IP address. My assumption is that most development environments run on private networks, so looking at the IP is a somewhat good way to determine what environment Whoops is running in.

  2. Deny rendering the exception if the user-agent is a "bot". My assumption is that even if the Whoops rendering of the exception is publicly available, we can at least deny the crawling bots the chance to index it. If you look at Google-cached Whoops pages, you will see that they have:

  • HTTP_FROM googlebot(at)googlebot.com
  • HTTP_USER_AGENT Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/41.0.2272.96 Mobile Safari/537.36 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)
  • HTTP_USER_AGENT Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)

So, it's a long shot, but anyway, this is a good precaution against careless developers.
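The two checks above could be sketched roughly as follows. This is a proposal only, assuming Whoops's existing `Run`/`PrettyPageHandler` API; the `isPrivateIPv4()` and `looksLikeBot()` helpers and the bot pattern are hypothetical, not part of Whoops:

```php
<?php
// Sketch of the two proposed guards. isPrivateIPv4() and looksLikeBot()
// are hypothetical helpers, not existing Whoops API.

function isPrivateIPv4(string $ip): bool
{
    // RFC 1918 ranges: 10.0.0.0/8, 172.16.0.0/12, 192.168.0.0/16
    return strpos($ip, '10.') === 0
        || strpos($ip, '192.168.') === 0
        || preg_match('/^172\.(1[6-9]|2\d|3[01])\./', $ip) === 1;
}

function looksLikeBot(string $userAgent): bool
{
    // Crude pattern; catches the Googlebot UA strings quoted above.
    return (bool) preg_match('/bot|crawl|spider|slurp/i', $userAgent);
}

$serverAddr = $_SERVER['SERVER_ADDR'] ?? '';
$userAgent  = $_SERVER['HTTP_USER_AGENT'] ?? '';

// Only register the pretty handler on a private address for a non-bot client.
if (isPrivateIPv4($serverAddr)
    && ! looksLikeBot($userAgent)
    && class_exists(\Whoops\Run::class)) {
    $whoops = new \Whoops\Run();
    $whoops->pushHandler(new \Whoops\Handler\PrettyPageHandler());
    $whoops->register();
}
```

With both checks failing closed, a misconfigured production box would fall back to the server's plain error page instead of a full Whoops dump.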

@shadowhand

shadowhand commented Nov 11, 2018

My assumption is that most development environments are done on private networks, so looking at the IP is a somewhat good way to determine what environment Whoops is running in.

A lot of public environments are private networks with a public load balancer in front. I still think this is a good idea but it probably won't catch as many public usages as one would hope.

Another suggestion: PrettyPageHandler will throw an exception upon construction unless some ENV flag is set:

// Refuse to construct the handler unless IS_DEV is set to a truthy value
if (! filter_var(getenv('IS_DEV'), FILTER_VALIDATE_BOOLEAN)) {
    throw new \RuntimeException("PrettyPageHandler can only be used in development");
}

Another possibility: eliminate almost every bit of output from PrettyPageHandler and force developers to manually enable it:

$handler = new PrettyPageHandler();
$handler->allowEnvironmentDetails(true);
$handler->allowStackTrace(true);
$handler->allowCodeLocation(true);

At least then PrettyPageHandler wouldn't immediately be a security risk. It would fall on developers to do more than follow the readme.
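A minimal sketch of that opt-in idea. The class and its `allow*()` methods are a proposal only; PrettyPageHandler does not currently expose them, and the plain-text output here stands in for the real HTML rendering:

```php
<?php
// Sketch of an opt-in handler: everything sensitive is off by default,
// and each allow*() call must be made explicitly.

class OptInHandler
{
    private bool $showEnvironment  = false;
    private bool $showStackTrace   = false;
    private bool $showCodeLocation = false;

    public function allowEnvironmentDetails(bool $allow): void
    {
        $this->showEnvironment = $allow;
    }

    public function allowStackTrace(bool $allow): void
    {
        $this->showStackTrace = $allow;
    }

    public function allowCodeLocation(bool $allow): void
    {
        $this->showCodeLocation = $allow;
    }

    public function handle(\Throwable $e): string
    {
        // Without any flags enabled, only a generic message is produced.
        $out = "Whoops, something went wrong.\n";
        if ($this->showCodeLocation) {
            $out .= sprintf("In %s on line %d\n", $e->getFile(), $e->getLine());
        }
        if ($this->showStackTrace) {
            $out .= $e->getTraceAsString() . "\n";
        }
        if ($this->showEnvironment) {
            $out .= 'PHP ' . PHP_VERSION . "\n";
        }
        return $out;
    }
}
```

The default output leaks nothing; each extra detail costs an explicit method call, which is exactly the friction the suggestion is after.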

@staabm
Contributor

staabm commented Nov 11, 2018

We should use the X-Robots-Tag: noindex HTTP header:
https://developers.google.com/search/reference/robots_meta_tag?hl=de
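Whatever happens with the IP checks, the header is cheap to emit before any error output. A minimal sketch; `buildRobotsHeader()` is a hypothetical helper, not Whoops API:

```php
<?php
// Send a noindex directive before rendering error output, so crawlers
// that honor X-Robots-Tag will not index the exception page.

function buildRobotsHeader(): string
{
    return 'X-Robots-Tag: noindex, nofollow';
}

// header() only works before any body bytes have been sent.
if (! headers_sent()) {
    header(buildRobotsHeader(), true);
}
```

Note this only keeps well-behaved crawlers from indexing an exposed page; it does not stop a human (or a misbehaving bot) from reading it.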

@kktsvetkov
Author

...will throw an exception upon construction unless some ENV flag is set

@shadowhand I think Laravel is set up in such a way when working with Whoops, by looking at config('app.debug').

@staabm
Contributor

staabm commented Nov 11, 2018

Regarding User-Agent sniffing: Googlebot also crawls websites with other UA strings (e.g. real browser UAs). I don't see much value in UA detection.

@denis-sokolov
Collaborator

This is a good suggestion. If anyone is up to implementing it, I would be open to accepting a PR.

https://github.com/filp/whoops/wiki/Possible-features-to-add

We try to keep the outstanding issues under control, so I will be closing this; until someone implements it, there is no action to be done.

@SvenRtbg
Contributor

And please remember: IPv6 is a thing now; only dealing with private IPv4 probably isn't enough.
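PHP's filter extension already knows the private and reserved ranges for both address families, so an IPv6-aware check does not need hand-rolled prefix lists. A sketch, not existing Whoops API:

```php
<?php
// Address-family-agnostic check. FILTER_FLAG_NO_PRIV_RANGE rejects the
// RFC 1918 IPv4 ranges and IPv6 unique-local addresses (fc00::/7);
// FILTER_FLAG_NO_RES_RANGE rejects loopback and other reserved blocks.

function isPublicAddress(string $ip): bool
{
    return filter_var(
        $ip,
        FILTER_VALIDATE_IP,
        FILTER_FLAG_NO_PRIV_RANGE | FILTER_FLAG_NO_RES_RANGE
    ) !== false;
}
```

Gating PrettyPageHandler on `! isPublicAddress($_SERVER['SERVER_ADDR'])` would then cover `10.0.0.1` and `fd00::1` alike, though as noted above it still misses public sites served from private networks behind a load balancer.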

Also note that Whoops already has "noindex" markup since 2017: #527
