It's cool how "Bayesian neural networks define a distribution over neural networks", so you can "sample from the posterior." Reminds me of the Gaussian process framework, which seems quite similar (distributions over functions). Has anyone thought about this overlap?

edit: posted before reading other comments; seems like @syntaxing's link is what I'm looking for: "A network with infinitely many weights with a distribution on each weight is a Gaussian process"
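To see that limit empirically, here's a rough NumPy sketch (my own, not from the linked discussion): sample one-hidden-layer networks from a Gaussian weight prior and watch the covariance of the outputs at two fixed inputs stabilize as the hidden layer widens, which is the practical face of the network-to-GP convergence:

```python
import numpy as np

rng = np.random.default_rng(1)

def prior_outputs(x, hidden, n_samples=2000):
    """Sample f(x) for many networks drawn from the weight prior."""
    # Weights ~ N(0, 1); output weights scaled by 1/sqrt(hidden) so the
    # variance of f(x) stays finite as hidden -> infinity.
    w1 = rng.normal(size=(n_samples, 1, hidden))
    b1 = rng.normal(size=(n_samples, 1, hidden))
    w2 = rng.normal(size=(n_samples, hidden, 1)) / np.sqrt(hidden)
    h = np.tanh(x[None, :, None] * w1 + b1)  # (n_samples, len(x), hidden)
    return (h @ w2)[..., 0]                  # (n_samples, len(x))

x = np.array([-1.0, 0.5])
for width in (10, 100, 1000):
    f = prior_outputs(x, width)
    # As width grows, this 2x2 matrix converges to the GP kernel
    # evaluated at the pair of inputs.
    print(width, np.cov(f.T).round(3))
```

The scaling of the output weights is the key assumption; without it the prior variance blows up with width instead of converging.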

That a Bayesian NN "defines a distribution over neural networks" just means that a Bayesian NN is a NN with priors on the weights. So it's not that there is some special similarity to GPs. BNNs are like any other Bayesian model, except that the NN supplies the likelihood.

The situation is usually flipped with Bayesian GP models -- a GP is usually used as a prior on the linear predictor.
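To make the "priors on the weights" point concrete, here's a minimal NumPy sketch (mine, not from the thread): put a Gaussian prior on every weight of a tiny one-hidden-layer net and draw several networks from it; each draw is a different function, which is exactly the "distribution over neural networks":

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_network(hidden=50, sigma_w=1.0):
    """Draw one network from the prior: every weight ~ N(0, sigma_w^2)."""
    w1 = rng.normal(0.0, sigma_w, size=(1, hidden))
    b1 = rng.normal(0.0, sigma_w, size=hidden)
    w2 = rng.normal(0.0, sigma_w / np.sqrt(hidden), size=(hidden, 1))
    def f(x):  # x: (n, 1)
        return np.tanh(x @ w1 + b1) @ w2
    return f

x = np.linspace(-3, 3, 100).reshape(-1, 1)
# Each prior draw is a different function; pointwise, the 20 curves
# give an empirical distribution over f(x) at every input.
samples = np.stack([sample_network()(x) for _ in range(20)])
print(samples.shape)  # (20, 100, 1)
```

Bayesian inference then conditions this prior on data; the posterior over weights is again a distribution over such functions, just reweighted toward those that fit.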

If anyone wants to learn about more advanced, bleeding-edge uncertainty estimation methods that aren't tied to neural nets, I can highly recommend Andrew Pownuk and Vladik Kreinovich's 2018 book "Combining Interval, Probabilistic, and Other Types of Uncertainty in Engineering Applications":

https://link.springer.com/book/10.1007%2F978-3-319-91026-0

which is basically a textbook treatment of Pownuk's PhD thesis:

http://www.cs.utep.edu/vladik/pownukPhD.pdf

David MacKay, 1992, Bayesian Methods for Adaptive Models http://www.inference.org.uk/mackay/thesis.pdf

Radford Neal, 1994, Bayesian Learning for Neural Networks https://www.cs.toronto.edu/~radford/ftp/thesis.pdf

Super interesting. I've been wanting to play around with BNNs after reading Yarin Gal's post on it [1], but my knowledge of how to make it work is limited. Does anyone here have a library or tutorial they'd recommend for PyTorch?

[1] http://mlg.eng.cam.ac.uk/yarin/blog_3d801aa532c1ce.html

I'd check out Pyro, a probabilistic programming language built on PyTorch.

You can find Bayesian Neural Network examples starting here: http://pyro.ai/examples/bayesian_regression.html

I think the documentation and tutorials are thorough and well laid out to ease you into Bayesian NNs and, more generally, handling uncertainty with neural networks + distributions. There are some Pyro-specific constructs in there, but it's the easiest way to get into BNNs without lots of prior knowledge.

https://twiecki.io/blog/2016/06/01/bayesian-deep-learning/

This is a good example. There’s not much info I’ve been able to find - I’d be interested if anyone else has a solid tutorial.

This PyMC tutorial is a nice place to start with this topic:

https://docs.pymc.io/notebooks/bayesian_neural_network_with_...