2021-07-01-bao21b.md

File metadata and controls

60 lines (60 loc) · 2.15 KB
---
title: "Variational (Gradient) Estimate of the Score Function in Energy-based Latent Variable Models"
abstract: "This paper presents new estimates of the score function and its gradient with respect to the model parameters in a general energy-based latent variable model (EBLVM). The score function and its gradient can be expressed as combinations of expectation and covariance terms over the (generally intractable) posterior of the latent variables. New estimates are obtained by introducing a variational posterior to approximate the true posterior in these terms. The variational posterior is trained to minimize a certain divergence (e.g., the KL divergence) between itself and the true posterior. Theoretically, the divergence characterizes upper bounds of the bias of the estimates. In principle, our estimates can be applied to a wide range of objectives, including kernelized Stein discrepancy (KSD), score matching (SM)-based methods and exact Fisher divergence with a minimal model assumption. In particular, these estimates applied to SM-based methods outperform existing methods in learning EBLVMs on several image datasets."
layout: inproceedings
series: Proceedings of Machine Learning Research
publisher: PMLR
issn: 2640-3498
id: bao21b
month: 0
tex_title: "Variational (Gradient) Estimate of the Score Function in Energy-based Latent Variable Models"
firstpage: 651
lastpage: 661
page: 651-661
order: 651
cycles: false
bibtex_author: Bao, Fan and Xu, Kun and Li, Chongxuan and Hong, Lanqing and Zhu, Jun and Zhang, Bo
author:
- given: Fan
  family: Bao
- given: Kun
  family: Xu
- given: Chongxuan
  family: Li
- given: Lanqing
  family: Hong
- given: Jun
  family: Zhu
- given: Bo
  family: Zhang
date: 2021-07-01
address:
container-title: Proceedings of the 38th International Conference on Machine Learning
volume: '139'
genre: inproceedings
issued:
  date-parts:
  - 2021
  - 7
  - 1
pdf:
extras:
---
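The abstract describes approximating the intractable posterior over latents with a trained variational posterior and using it to estimate the score of the model. Below is a minimal sketch, not the authors' code, of that idea for an energy-based latent variable model p(v, h) ∝ exp(-E(v, h)): the marginal score satisfies ∇_v log p(v) = E_{p(h|v)}[-∇_v E(v, h)], the intractable p(h|v) is replaced by a variational q(h|v) fitted by minimizing KL(q(h|v) ‖ p(h|v)), and the expectation is estimated by Monte Carlo. The toy energy network, Gaussian posterior, and all names (`ToyEnergy`, `GaussianPosterior`, `posterior_loss`, `score_estimate`) are illustrative assumptions, not part of the paper.

```python
import math
import torch
import torch.nn as nn


class ToyEnergy(nn.Module):
    """E(v, h): a small MLP energy over visible variables v and latents h (illustrative)."""
    def __init__(self, v_dim, h_dim):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(v_dim + h_dim, 64), nn.Tanh(), nn.Linear(64, 1))

    def forward(self, v, h):
        return self.net(torch.cat([v, h], dim=-1)).squeeze(-1)


class GaussianPosterior(nn.Module):
    """q(h|v): amortized diagonal-Gaussian variational posterior (illustrative)."""
    def __init__(self, v_dim, h_dim):
        super().__init__()
        self.net = nn.Linear(v_dim, 2 * h_dim)

    def rsample_and_log_prob(self, v, n_samples):
        mu, log_std = self.net(v).chunk(2, dim=-1)
        std = log_std.exp()
        eps = torch.randn(n_samples, *mu.shape)
        h = mu + std * eps  # reparameterized samples, shape (n_samples, batch, h_dim)
        log_q = (-0.5 * eps ** 2 - log_std - 0.5 * math.log(2 * math.pi)).sum(-1)
        return h, log_q


def posterior_loss(energy, q, v, n_samples=8):
    # Minimizing E_q[log q(h|v) + E(v, h)] over the parameters of q equals
    # minimizing KL(q(h|v) || p(h|v)) up to a constant, since p(h|v) ∝ exp(-E(v, h)).
    h, log_q = q.rsample_and_log_prob(v, n_samples)
    e = energy(v.expand(n_samples, *v.shape), h)
    return (log_q + e).mean()


def score_estimate(energy, q, v, n_samples=64):
    # Monte Carlo estimate of the marginal score:
    #   grad_v log p(v) = E_{p(h|v)}[-grad_v E(v, h)] ≈ E_{q(h|v)}[-grad_v E(v, h)].
    v = v.detach().requires_grad_(True)
    h, _ = q.rsample_and_log_prob(v, n_samples)
    e = energy(v.expand(n_samples, *v.shape), h.detach()).mean(0).sum()
    (grad_v,) = torch.autograd.grad(e, v)
    return -grad_v


if __name__ == "__main__":
    v_dim, h_dim = 4, 2
    energy, q = ToyEnergy(v_dim, h_dim), GaussianPosterior(v_dim, h_dim)
    v = torch.randn(16, v_dim)
    print("variational posterior loss:", posterior_loss(energy, q, v).item())
    print("score estimate shape:", tuple(score_estimate(energy, q, v).shape))
```

In a full method the score estimate would be plugged into an objective such as score matching or kernelized Stein discrepancy, as the abstract notes; the gradient of the score with respect to the model parameters additionally involves covariance terms over the posterior, which this sketch omits.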