Regression Bayesian Network


Goal: to learn a deep directed generative model with multiple layers of latent variables. We propose the regression Bayesian network (RBN) as a new building block for constructing such deep models.

Motivation: the latent variables in a directed model are dependent on each other given the observations. These dependencies can help better explain the patterns in the input data.


Proposed methods

  • Learning: we propose a stochastic approximation procedure for gradient-ascent learning. Learning consists of layerwise learning followed by global fine-tuning, and can be performed in either an unsupervised or a supervised manner. Supervised learning includes both generative and discriminative exact methods that incorporate an integer programming formulation.
  • Inference: we propose a pseudo-likelihood method for posterior probability inference and an augmented coordinate ascent method for maximum a posteriori (MAP) inference.
  • Applications: we evaluate the learning and inference algorithms on various computer vision tasks, including digit modeling, image restoration, face reconstruction, action recognition, head pose estimation, and eye gaze estimation.
  • Publications

    Quan Gan, Siqi Nie, Shangfei Wang, and Qiang Ji, "Differentiating between Posed and Spontaneous Expressions with Latent Regression Bayesian Network," in AAAI Conference on Artificial Intelligence (AAAI), 2017. To appear.

    Siqi Nie and Qiang Ji, "Latent Regression Bayesian Network for Data Representation," in Proceedings of the 23rd International Conference on Pattern Recognition (ICPR), 2016.
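To give a flavor of coordinate-ascent MAP inference over latent variables, here is a minimal sketch on a toy model, not the papers' actual RBN: binary latents h with a Bernoulli prior generate Gaussian visibles x through a linear map. All names (`W`, `b`, `c`, `map_coordinate_ascent`) and the toy dimensions are hypothetical; each coordinate of h is greedily set to whichever value increases the joint probability, sweeping until no coordinate changes.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy model (not the papers' exact RBN):
# x | h ~ N(W h + b, I),  h_j ~ Bernoulli(sigmoid(c_j)).
D, K = 8, 4                      # visible / latent dimensions (toy sizes)
W = rng.normal(scale=0.5, size=(D, K))
b = rng.normal(size=D)
c = rng.normal(size=K)

def log_joint(x, h):
    """Unnormalized log p(x, h) for the toy model."""
    resid = x - (W @ h + b)
    log_px_h = -0.5 * resid @ resid
    # Bernoulli prior: sum_j [h_j c_j - log(1 + exp(c_j))]
    log_ph = h @ c - np.sum(np.logaddexp(0.0, c))
    return log_px_h + log_ph

def map_coordinate_ascent(x, n_sweeps=20):
    """Coordinate ascent for MAP inference over binary latents:
    set each h_j to whichever value increases the joint probability,
    sweeping until no coordinate changes (a local maximum)."""
    h = rng.integers(0, 2, size=K).astype(float)
    for _ in range(n_sweeps):
        changed = False
        for j in range(K):
            h0, h1 = h.copy(), h.copy()
            h0[j], h1[j] = 0.0, 1.0
            best = 1.0 if log_joint(x, h1) > log_joint(x, h0) else 0.0
            if best != h[j]:
                h[j] = best
                changed = True
        if not changed:          # converged: no single flip improves
            break
    return h

# Sample a visible vector from the model, then infer a MAP latent code.
h_true = (rng.random(K) < 0.5).astype(float)
x = W @ h_true + b + 0.1 * rng.normal(size=D)
h_map = map_coordinate_ascent(x)
print(h_map)
```

Because the state space is finite and every accepted flip strictly increases the joint, the sweep terminates at a configuration where no single-coordinate flip helps; the papers' augmented variant addresses getting trapped in such local optima.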