Kullback–Leibler divergence / Thermodynamics / Divergence / Markov chain / Maximum likelihood / Gibbs sampling / Mean difference / Expectation–maximization algorithm / Statistics / Statistical theory / Estimation theory
Date: 2014-11-07 00:39:15

Projecting Markov Random Field Parameters for Fast Mixing

Source URL: users.cecs.anu.edu.au

File Size: 605.87 KB

Similar Documents

Computing / Computer memory / Hardware acceleration / Central processing unit / Parallel computing / Computer hardware / Dynamic random-access memory / Computer architecture / Field-programmable gate array

FPGAs as Streaming MIMD Machines for Data Analytics. James Thomas, Matei Zaharia, Pat Hanrahan. CPU/GPU Control Flow Divergence

DocID: 1xVnb

Mathematical analysis / Mathematics / Post-quantum cryptography / Number theory / Measure theory / Cryptography / Algebra / Distribution / Support / Learning with errors / Weight / Kullback–Leibler divergence

Improved security proofs in lattice-based cryptography: using the Rényi divergence rather than the statistical distance. Shi Bai, Tancrède Lepoint, Adeline Roux-Langlois, Amin Sakzad, Damien Stehlé, and Ron Steinfeld

DocID: 1xTbf

Vector differential operators in cylindrical coordinates (r, ϕ, z): Divergence

DocID: 1vsgG

Black-box α-divergence for Deep Generative Models. Thang D. Bui (University of Cambridge), José Miguel Hernández-Lobato

DocID: 1vpGS

QUARTERLY MARKET DIGEST: Global Equity Divergence, July 16, 2018. Q2 Quick Summary

DocID: 1vgxM