
feature request: dropout during prediction phase #41

Open
goodmansasha opened this issue Jan 4, 2017 · 4 comments

Comments

@goodmansasha

goodmansasha commented Jan 4, 2017

Yarin Gal's thesis demonstrated that by applying dropout during prediction, not just training, a set of predictions can be produced and then analyzed to reveal model uncertainty (the quantiles of the predictions can produce confidence intervals, for example, much like bootstrapping). The thesis has been praised several times by more senior researchers, although I don't have those links right now (the thesis is here: http://mlg.eng.cam.ac.uk/yarin/blog_2248.html ). One link is here, where Yann LeCun mentions that he developed a similar technique decades ago, and that he was Bayesian before it was cool: http://mlg.eng.cam.ac.uk/yarin/website/blog_3d801aa532c1ce.html

Today I got keras-js working, but after searching the code it appears the dropout layers are a passthrough during prediction.

In Keras, dropout can also be enabled during prediction with K.function(model.inputs + [K.learning_phase()], model.outputs) or something like this (see yaringal/BayesianRNN#3 (comment)). There is also a rule for recurrent layers: the same dropout mask is reused at every timestep within a layer, as is also the case in native TensorFlow and Torch.
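The shared-mask rule can be illustrated without any framework. Below is a minimal NumPy sketch (a toy vanilla RNN with made-up shapes, purely illustrative — not keras-js or Keras code): one dropout mask is sampled per sequence and reused at every timestep.

```python
import numpy as np

rng = np.random.default_rng(1)

hidden = 8
Wx = rng.normal(size=(4, hidden))        # input-to-hidden weights (toy)
Wh = rng.normal(size=(hidden, hidden))   # hidden-to-hidden weights (toy)

def rnn_forward(seq, drop_rate=0.5):
    """Vanilla RNN pass with recurrent dropout: sample ONE mask per
    sequence and apply it to the hidden state at every timestep."""
    mask = (rng.random(hidden) > drop_rate) / (1.0 - drop_rate)
    h = np.zeros(hidden)
    for x_t in seq:
        h = np.tanh(x_t @ Wx + (h * mask) @ Wh)  # same mask each step
    return h

seq = rng.normal(size=(5, 4))  # a sequence of 5 timesteps, 4 features each
h1 = rnn_forward(seq)          # one stochastic forward pass
h2 = rnn_forward(seq)          # another pass, with a freshly sampled mask
```

Each call samples a fresh mask, so repeated calls on the same sequence give the stochastic predictions that uncertainty estimates are built from.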

Unfortunately, I don't know how to implement this in keras-js myself, because it requires a particular combination of expert knowledge of JavaScript, Keras, and neural networks, but I could perhaps help a bit because I am familiar with all three (e.g. I've taken Andrew Ng's Coursera course and am almost finished with Hinton's). Please advise.

I envision this as a possible 'killer application' of keras-js: producing confidence intervals on particular predictions without the overhead of taxing a main server, plus data visualization, visualizing uncertainty, and more. Please consider this.
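The confidence-interval idea itself can be sketched framework-free. The following is a minimal NumPy illustration (a hypothetical toy network with random "pre-trained" weights, illustrative only): keep dropout active at prediction time, run many stochastic forward passes, and read an interval off the prediction quantiles.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy one-hidden-layer regression network with fixed (stand-in) weights.
W1 = rng.normal(size=(3, 16))
W2 = rng.normal(size=(16, 1))

def predict_stochastic(x, drop_rate=0.5):
    """One forward pass with dropout left ON (inverted-dropout scaling)."""
    h = np.maximum(x @ W1, 0.0)              # ReLU hidden layer
    mask = rng.random(h.shape) > drop_rate   # Bernoulli keep-mask
    h = h * mask / (1.0 - drop_rate)         # rescale to keep expectation
    return h @ W2

x = np.ones((1, 3))
samples = np.concatenate([predict_stochastic(x) for _ in range(1000)], axis=0)

mean = samples.mean()
lo, hi = np.quantile(samples, [0.025, 0.975])  # 95% interval from quantiles
print(f"mean={mean:.3f}, 95% interval=[{lo:.3f}, {hi:.3f}]")
```

In keras-js this would amount to not treating Dropout as a passthrough at prediction time and running the forward pass repeatedly per input.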

@transcranial
Owner

I've been reading some of Yarin's work. This is quite interesting! It could be a nice unique feature, especially if it could be made interactive. It would probably be implemented separately from the Dropout layer -- that layer is there purely for 1-to-1 API compatibility.

@goodmansasha
Author

Where do you think it could be implemented at this time? This codebase appears to be organized around Keras layers, but for something like an RNN the dropout mask is repeated across timesteps, and the code might need more context, such as the TensorFlow computation graph (?).

@andrisecker

Any updates on this? Or, if it's not in core Keras, does anybody have a workaround to enable dropout at test time?

@ni9elf

ni9elf commented Jun 5, 2018

To enable dropout at test time, have a look at keras-team/keras#9412 and keras-team/keras#1606.
