Variational Bayesian Learning Theory

Shinichi Nakajima, Technische Universität Berlin
Kazuho Watanabe, Toyohashi University of Technology
Masashi Sugiyama, University of Tokyo
August 2019
Hardback
9781107076150

    Variational Bayesian learning is one of the most popular methods in machine learning. Designed for researchers and graduate students in machine learning, this book summarizes recent developments in the non-asymptotic and asymptotic theory of variational Bayesian learning and suggests how this theory can be applied in practice. The authors begin by developing a basic framework with a focus on conjugacy, which enables the reader to derive tractable algorithms. Next, the book summarizes non-asymptotic theory, which, although limited in application to bilinear models, precisely describes the behavior of the variational Bayesian solution and reveals its sparsity-inducing mechanism. Finally, the text summarizes asymptotic theory, which reveals phase transition phenomena depending on the prior setting, thus providing suggestions on how to set hyperparameters for particular purposes. Detailed derivations allow readers to follow along without prior knowledge of the mathematical techniques specific to Bayesian learning.

    • Provides a detailed theory of variational Bayesian learning and suggests various applications
    • Introduces and covers recent developments in non-asymptotic and asymptotic theory
    • Remains accessible to students without prior knowledge of the relevant mathematical techniques, featuring detailed derivations and explanations of new concepts
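
    As a concrete illustration of the conjugacy point in the description above (a minimal sketch, not an example taken from the book), the following code runs mean-field variational Bayes for a univariate Gaussian with unknown mean and precision under a conjugate Gaussian-Gamma prior. Because the prior is conjugate, the factorized posterior q(mu)q(tau) keeps the Gaussian and Gamma forms, so each coordinate-ascent update is available in closed form. The function name vb_gaussian and the default hyperparameters are illustrative choices, not the book's notation.

```python
import numpy as np

# Minimal sketch of mean-field variational Bayes for a univariate Gaussian
# with unknown mean mu and precision tau, under the conjugate prior
#   p(mu | tau) = N(mu | mu0, (lambda0 * tau)^-1),  p(tau) = Gamma(tau | a0, b0).
# The factorized posterior q(mu) q(tau) stays Gaussian x Gamma, so the
# coordinate-ascent updates are closed form. (Illustrative only; names and
# hyperparameter defaults are assumptions, not taken from the book.)

def vb_gaussian(x, mu0=0.0, lambda0=1.0, a0=1.0, b0=1.0, n_iter=50):
    x = np.asarray(x, dtype=float)
    n, xbar = x.size, x.mean()
    e_tau = a0 / b0                              # initial guess for E[tau]
    for _ in range(n_iter):
        # q(mu) = N(mu | mu_n, 1 / lam_n)
        mu_n = (lambda0 * mu0 + n * xbar) / (lambda0 + n)
        lam_n = (lambda0 + n) * e_tau
        e_mu, e_mu2 = mu_n, mu_n ** 2 + 1.0 / lam_n
        # q(tau) = Gamma(tau | a_n, b_n)
        a_n = a0 + 0.5 * (n + 1)
        b_n = b0 + 0.5 * (np.sum(x ** 2) - 2 * n * xbar * e_mu + n * e_mu2
                          + lambda0 * (e_mu2 - 2 * mu0 * e_mu + mu0 ** 2))
        e_tau = a_n / b_n
    return mu_n, lam_n, a_n, b_n

# Example: the posterior mean E[mu] = mu_n should land near 2.0 and
# E[tau] = a_n / b_n near 1 / 0.5**2 = 4 for this synthetic data set.
rng = np.random.default_rng(0)
data = rng.normal(loc=2.0, scale=0.5, size=200)
mu_n, lam_n, a_n, b_n = vb_gaussian(data)
print(mu_n, a_n / b_n)
```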

    Reviews & endorsements

    'This book presents a very thorough and useful explanation of classical (pre-deep-learning) mean field variational Bayes. It covers basic algorithms, detailed derivations for various models (e.g. matrix factorization, GLMs, GMMs, HMMs), and advanced theory, including results on the sparsity of the VB estimator and asymptotic properties (generalization bounds).' Kevin Murphy, Research Scientist, Google Brain

    'This book is an excellent and comprehensive reference on the topic of Variational Bayes (VB) inference, which is heavily used in probabilistic machine learning. It covers VB theory and algorithms, and gives a detailed exploration of these methods for matrix factorization and extensions. It will be an essential guide for those using and developing VB methods.' Chris Williams, University of Edinburgh

    Product details

    August 2019
    Hardback
    9781107076150
    558 pages
    235 × 156 × 34 mm
    0.9 kg
    Available

    Table of Contents

    • 1. Bayesian learning
    • 2. Variational Bayesian learning
    • 3. VB algorithm for multi-linear models
    • 4. VB algorithm for latent variable models
    • 5. VB algorithm under no conjugacy
    • 6. Global VB solution of fully observed matrix factorization
    • 7. Model-induced regularization and sparsity inducing mechanism
    • 8. Performance analysis of VB matrix factorization
    • 9. Global solver for matrix factorization
    • 10. Global solver for low-rank subspace clustering
    • 11. Efficient solver for sparse additive matrix factorization
    • 12. MAP and partially Bayesian learning
    • 13. Asymptotic Bayesian learning theory
    • 14. Asymptotic VB theory of reduced rank regression
    • 15. Asymptotic VB theory of mixture models
    • 16. Asymptotic VB theory of other latent variable models
    • 17. Unified theory.