Monday, January 16, 2017

Edward: Deep Probabilistic Programming - implementation -

Dustin mentioned it on his Twitter feed:

Deep Probabilistic Programming by Dustin Tran, Matthew D. Hoffman, Rif A. Saurous, Eugene Brevdo, Kevin Murphy, David M. Blei

We propose Edward, a Turing-complete probabilistic programming language. Edward builds on two compositional representations: random variables and inference. By treating inference as a first-class citizen, on a par with modeling, we show that probabilistic programming can be as flexible and computationally efficient as traditional deep learning. For flexibility, Edward makes it easy to fit the same model using a variety of composable inference methods, ranging from point estimation, to variational inference, to MCMC. In addition, Edward can reuse the modeling representation as part of inference, facilitating the design of rich variational models and generative adversarial networks. For efficiency, Edward is integrated into TensorFlow, providing significant speedups over existing probabilistic systems. For example, on a benchmark logistic regression task, Edward is at least 35x faster than Stan and PyMC3.
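To make "inference as a first-class citizen" concrete, here is a minimal Bayesian linear regression sketch in the spirit of Edward's getting-started example. It assumes the Edward 1.x API (Normal, ed.dot, ed.KLqp); keyword names such as loc/scale changed across releases (earlier versions used mu/sigma), so treat it as illustrative rather than definitive:

    import numpy as np
    import tensorflow as tf
    import edward as ed
    from edward.models import Normal

    N, D = 50, 3  # toy data set size
    X_train = np.random.randn(N, D).astype(np.float32)
    y_train = X_train.dot(np.ones(D, dtype=np.float32))

    # Model: random variables composed into a graph.
    X = tf.placeholder(tf.float32, [N, D])
    w = Normal(loc=tf.zeros(D), scale=tf.ones(D))
    b = Normal(loc=tf.zeros(1), scale=tf.ones(1))
    y = Normal(loc=ed.dot(X, w) + b, scale=tf.ones(N))

    # Variational family: fully factorized Gaussians with free parameters.
    qw = Normal(loc=tf.Variable(tf.zeros(D)),
                scale=tf.nn.softplus(tf.Variable(tf.zeros(D))))
    qb = Normal(loc=tf.Variable(tf.zeros(1)),
                scale=tf.nn.softplus(tf.Variable(tf.zeros(1))))

    # Inference is an object too: bind latent variables to approximations
    # and observed variables to data, then run.
    inference = ed.KLqp({w: qw, b: qb}, data={X: X_train, y: y_train})
    inference.run(n_iter=500)

The {latent variable: approximation} dictionary is the pattern the later sketches below reuse.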
From the Edward page:

A library for probabilistic modeling, inference, and criticism.

Edward is a Python library for probabilistic modeling, inference, and criticism. It is a testbed for fast experimentation and research with probabilistic models, ranging from classical hierarchical models on small data sets to complex deep probabilistic models on large data sets. Edward fuses three fields: Bayesian statistics and machine learning, deep learning, and probabilistic programming.
It supports modeling with
  • Directed graphical models (a minimal sketch follows this list)
  • Neural networks (via libraries such as Keras and TensorFlow Slim)
  • Conditionally specified undirected models
  • Bayesian nonparametrics and probabilistic programs
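As a concrete instance of the first bullet, a directed graphical model is expressed by letting one random variable's parameters depend on another. A minimal Beta-Bernoulli sketch; keyword names such as probs varied across releases (older versions used p), so this is indicative only:

    import tensorflow as tf
    from edward.models import Bernoulli, Beta

    # theta ~ Beta(1, 1); x_n | theta ~ Bernoulli(theta), n = 1..10.
    # The directed edge theta -> x is just theta appearing in x's parameters.
    theta = Beta(1.0, 1.0)
    x = Bernoulli(probs=tf.ones(10) * theta)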
It supports inference with
  • Variational inference (contrasted with Monte Carlo in the sketch after this list)
    • Black box variational inference
    • Stochastic variational inference
    • Inclusive KL divergence: KL(p‖q)
    • Maximum a posteriori estimation
  • Monte Carlo
    • Hamiltonian Monte Carlo
    • Stochastic gradient Langevin dynamics
    • Metropolis-Hastings
  • Compositions of inference
    • Expectation-Maximization
    • Pseudo-marginal and ABC methods
    • Message passing algorithms
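Because every inference routine shares one interface, moving between these families is close to a one-line change. A hedged sketch that reuses the regression model (w, X, y and the training arrays) from the first example, assuming the Edward 1.x names KLqp, HMC and Empirical:

    import tensorflow as tf
    import edward as ed
    from edward.models import Empirical, Normal

    # Variational inference: fit a parametric Gaussian approximation
    # (binding only w for brevity; b is handled the same way).
    qw_vi = Normal(loc=tf.Variable(tf.zeros(D)),
                   scale=tf.nn.softplus(tf.Variable(tf.zeros(D))))
    ed.KLqp({w: qw_vi}, data={X: X_train, y: y_train}).run(n_iter=500)

    # Hamiltonian Monte Carlo: same model, but the approximation is an
    # Empirical distribution that stores T posterior samples.
    T = 1000
    qw_hmc = Empirical(params=tf.Variable(tf.zeros([T, D])))
    ed.HMC({w: qw_hmc}, data={X: X_train, y: y_train}).run(step_size=0.01,
                                                           n_steps=5)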
It supports criticism of the model and inference with
  • Point-based evaluations
  • Posterior predictive checks
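Both criticism tools start from the posterior predictive. A sketch continuing the regression example, where X_test and y_test stand for hypothetical held-out arrays shaped like the training data, and ed.copy, ed.evaluate and ed.ppc are the Edward 1.x entry points:

    import tensorflow as tf
    import edward as ed

    # Posterior predictive: swap priors for fitted approximations in the graph.
    y_post = ed.copy(y, {w: qw, b: qb})

    # Point-based evaluation: score predictions on held-out data.
    mse = ed.evaluate('mean_squared_error', data={X: X_test, y_post: y_test})

    # Posterior predictive check: compare a test statistic on data
    # replicated from the model against the same statistic on the data.
    stats = ed.ppc(lambda xs, zs: tf.reduce_mean(xs[y_post]),
                   data={X: X_train, y_post: y_train})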
Edward is built on top of TensorFlow. This gives it computational graphs, distributed training, CPU/GPU integration, automatic differentiation, and visualization with TensorBoard.
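Since an Edward random variable is backed by a tf.Tensor, it composes directly with ordinary TensorFlow ops; and, if the docs of this period are any guide (treat the logdir argument as an assumption on my part), pointing inference at a log directory produces TensorBoard summaries:

    import tensorflow as tf
    from edward.models import Normal

    # A random variable mixes freely with plain TensorFlow operations.
    z = Normal(loc=tf.zeros(2), scale=tf.ones(2))
    total = tf.reduce_sum(2.0 * z)  # an ordinary tf.Tensor

    # Assumption: logdir writes TensorBoard event files during inference
    # (reusing the inference object from the first sketch).
    inference.run(n_iter=500, logdir='log')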

Authors

Edward is led by Dustin Tran with guidance from David Blei.
We are open to collaboration and welcome researchers and developers to contribute. Check out the contributing page for how to improve Edward’s software. For broader research challenges, shoot one of us an e-mail.
Edward has benefited enormously from the helpful feedback and advice of many individuals: Jaan Altosaar, Eugene Brevdo, Allison Chaney, Joshua Dillon, Matthew Hoffman, Kevin Murphy, Rajesh Ranganath, Rif Saurous, and other members of the Blei Lab, Google Brain, and Google Research.

Citation

We appreciate citations for Edward.
Dustin Tran, Alp Kucukelbir, Adji B. Dieng, Maja Rudolph, Dawen Liang, and David M. Blei. 2016. Edward: A library for probabilistic modeling, inference, and criticism. arXiv preprint arXiv:1610.09787.




