Probabilistic deep learning
See:
# Resources
- A Comprehensive Introduction to Bayesian Deep Learning
- Bayesian Neural Network tutorial
- Bayesian Deep Learning - NeurIPS Workshop
- Deep Learning Is Not Good Enough, We Need Bayesian Deep Learning for Safe AI
- Making Your Neural Network Say “I Don’t Know” — Bayesian NNs using Pyro and PyTorch
- Building a Bayesian deep learning classifier
- Physics - a Gateway to Bayesian Deep Learning
- Bayesian deep learning with Fastai : how not to be uncertain about your uncertainty!
- BNNs are a way to add uncertainty handling to our models. The idea is simple: instead of learning deterministic weights, we learn the parameters of a probability distribution from which the weights are sampled during forward propagation. The distribution parameters are then learned with backpropagation, sometimes using a trick (e.g. the reparameterization trick) to keep the sampling step differentiable (see the sketch after this list).
- Dropout is a way to make your neural networks Bayesian almost for free: keep dropout active at inference time and sample several stochastic forward passes. This is called MC Dropout.
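A minimal sketch of the idea above, assuming a PyTorch setting: each weight gets a learned mean and scale, and weights are sampled in the forward pass via the reparameterization trick so gradients flow to the distribution parameters. The class and parameter names (`BayesianLinear`, `w_mu`, `w_rho`, ...) are illustrative, not taken from any of the libraries listed here.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class BayesianLinear(nn.Module):
    """Linear layer whose weights are random variables with learned mean/scale (illustrative)."""
    def __init__(self, in_features, out_features):
        super().__init__()
        # Variational parameters: mean and (pre-softplus) scale of a Gaussian per weight
        self.w_mu = nn.Parameter(torch.zeros(out_features, in_features))
        self.w_rho = nn.Parameter(torch.full((out_features, in_features), -3.0))
        self.b_mu = nn.Parameter(torch.zeros(out_features))
        self.b_rho = nn.Parameter(torch.full((out_features,), -3.0))

    def forward(self, x):
        # Reparameterization trick: w = mu + sigma * eps, so gradients reach mu and rho
        w_sigma = F.softplus(self.w_rho)
        b_sigma = F.softplus(self.b_rho)
        w = self.w_mu + w_sigma * torch.randn_like(w_sigma)
        b = self.b_mu + b_sigma * torch.randn_like(b_sigma)
        return F.linear(x, w, b)

# Each forward pass draws fresh weights, so repeated calls give different outputs
layer = BayesianLinear(10, 2)
x = torch.randn(4, 10)
print(layer(x))  # stochastic prediction; in practice trained with an ELBO loss (e.g. Bayes by Backprop)
```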
# Monte Carlo Dropout
- Monte Carlo Dropout
- What is MC Dropout
- Normal dropout (applied only at training time) serves as regularization to avoid overfitting. At test time dropout is not applied; all nodes/connections are present, but the weights are adjusted accordingly (e.g. multiplied by the keep probability, which is 1 - dropout_rate). Such a model at test time can be understood as an average over an ensemble of neural networks.
- Notice that with normal dropout the prediction at test time is deterministic. Without another source of randomness, the model will always predict the same label or value for a given test data point.
- For Monte Carlo dropout, dropout is applied at both training and test time. At test time the prediction is no longer deterministic; it depends on which nodes/links are randomly kept, so the model can predict a different value each time it is run on the same data point.
- The primary goal of MC dropout is to generate random predictions and interpret them as samples from a probability distribution (see the sketch after this list).
- #TALK Estimación de la Incertidumbre en Redes Neuronales (Uncertainty Estimation in Neural Networks) (Valdenegro)
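A small sketch of MC Dropout at inference, assuming PyTorch: dropout layers are left in sampling mode at test time and several stochastic forward passes are averaged, with their spread used as an uncertainty estimate. The architecture, sizes and `mc_dropout_predict` helper are illustrative assumptions, not from any specific library above.

```python
import torch
import torch.nn as nn

# A small net with dropout; layer sizes and dropout rate are just for illustration
model = nn.Sequential(
    nn.Linear(10, 64), nn.ReLU(), nn.Dropout(p=0.5),
    nn.Linear(64, 64), nn.ReLU(), nn.Dropout(p=0.5),
    nn.Linear(64, 1),
)

def mc_dropout_predict(model, x, n_samples=50):
    """Keep dropout active at test time and aggregate several stochastic passes."""
    model.train()  # keeps Dropout layers sampling (fine here; the model has no BatchNorm)
    with torch.no_grad():
        preds = torch.stack([model(x) for _ in range(n_samples)])
    # Predictive mean and a simple spread-based uncertainty estimate
    return preds.mean(dim=0), preds.std(dim=0)

x = torch.randn(8, 10)
mean, std = mc_dropout_predict(model, x)
```

If the model also contains layers whose train/eval behaviour differs (e.g. BatchNorm), only the dropout modules would be switched to training mode rather than the whole model.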
# Code
- #CODE Pyro (Uber) - Deep universal probabilistic programming with Python and PyTorch
- #CODE Blitz - Bayesian Layers in Torch Zoo
- #CODE Bean machine (Meta/Facebook)
- #CODE Edward (edwardlib) - Edward is a Python library for probabilistic modeling, inference, and criticism
- https://theintelligenceofinformation.wordpress.com/2017/06/02/pydata-london-2017-bayesian-deep-learning-talk-by-andrew-rowan/
- #TALK https://www.youtube.com/watch?v=I09QVNrUS3Q
- http://willwolf.io/2017/06/15/random-effects-neural-networks/
- #CODE TensorFlow Probability
- #CODE keras-uncertainty - implements Monte Carlo Dropout (MC-Dropout) and Deep Ensembles
# Books
- #BOOK Probabilistic Graphical Models: Principles and Techniques (Koller, 2009 MIT)
- #BOOK Probabilistic Deep Learning - With Python, Keras and TensorFlow Probability (Dürr, MANNING 2020)
# Talks
# Courses
# References
- #PAPER Probabilistic machine learning and artificial intelligence (Ghahramani 2015)
- #PAPER Dropout as a Bayesian Approximation: Representing Model Uncertainty in Deep Learning (Gal 2016)
- #PAPER Bayesian Neural Networks (Mullachery, 2018)
- #PAPER Deep Sub-Ensembles for Fast Uncertainty Estimation in Image Classification (Valdenegro-Toro 2019)
- #PAPER Bayesian Recurrent Neural Networks (Fortunato 2019)
- #PAPER Bayesian Deep Learning and a Probabilistic Perspective of Generalization (Gordon Wilson, 2020)
- #PAPER Hands-on Bayesian Neural Networks - a Tutorial for Deep Learning Users (Jospin 2020)
- #PAPER DropConnect is effective in modeling uncertainty of Bayesian deep networks (Mobiny 2021)
- #PAPER Epistemic Neural Networks (Osband 2021)
- #PAPER Uncertainty Baselines: Benchmarks for Uncertainty & Robustness in Deep Learning (Nado 2021)
- #PAPER Probabilistic Deep Learning with Probabilistic Neural Networks and Deep Probabilistic Models (Chang 2021)
- #PAPER #REVIEW A Survey of Uncertainty in Deep Neural Networks (Gawlikowski 2022)