• Bayesian inference has the potential to address shortcomings of deep neural networks (DNNs), such as poor calibration. However, scaling Bayesian methods to modern DNNs is challenging. This blog post describes subnetwork inference, a method that tackles this issue by performing inference over only a small, carefully selected subset of the DNN weights.
  • Automating the design of molecules with desirable properties can greatly accelerate the search for novel drugs and materials. However, to make further progress we need to go beyond graph-based approaches. In this blog post, we use ideas from reinforcement learning and quantum chemistry to take a first step towards 3D molecular design.
  • What does it mean to combine variational inference with natural gradients? Can this scale to neural networks? What kind of approximations do we need to make? We take a detailed look at the mathematical derivations of such algorithms.
  • The theory of subjective probability describes ideally consistent behaviour and ought not, therefore, be taken too literally.
    — Leonard Jimmie Savage (1917–1971)
  • The theory of probabilities is at bottom nothing but common sense reduced to calculus; it enables us to appreciate with exactness that which accurate minds feel with a sort of instinct for which ofttimes they are unable to account.
    — Pierre-Simon Laplace (1749–1827)