Multivariate-Information Adversarial Ensemble for Scalable Joint Distribution Matching

Authors: Xiaodan Liang, Liang Lin, Xiaopeng Yan, Guanbin Li, Zhanfu Yang

DOI:

Keywords:

Abstract: A broad range of cross-$m$-domain generation research boils down to matching a joint distribution by deep generative models (DGMs). Hitherto, algorithms excel in pairwise domains, but as $m$ increases they struggle to scale to fit the $m$-domain joint distribution. In this paper, we propose a domain-scalable DGM, i.e., MMI-ALI, for $m$-domain joint distribution matching. As an ensemble model of ALIs \cite{dumoulin2016adversarially}, MMI-ALI is adversarially trained by maximizing the Multivariate Mutual Information (MMI) w.r.t. the joint variables of each domain pair and their shared feature. The negative MMIs are upper bounded by a series of feasible losses that provably lead to matching the $m$-domain joint distributions. MMI-ALI scales linearly as $m$ increases and thus strikes the right balance between efficacy and scalability. We evaluate MMI-ALI in diverse challenging $m$-domain scenarios and verify its superiority.
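For orientation, the multivariate mutual information invoked in the abstract can be read as McGill's three-way interaction information (McGill, 1954, cited below). The following is a minimal sketch for a single domain pair and a shared feature; the variable names $x_i$, $x_j$, $z$ are illustrative notation, not taken from the paper, and the exact estimator and bounds used by MMI-ALI are those defined in the paper itself.

\begin{align}
  I(x_i; x_j; z)
    &= I(x_i; x_j) - I(x_i; x_j \mid z) \\
    &= \mathbb{E}_{p(x_i, x_j, z)}\!\left[
         \log \frac{p(x_i, x_j)\, p(x_i, z)\, p(x_j, z)}
                   {p(x_i)\, p(x_j)\, p(z)\, p(x_i, x_j, z)}
       \right].
\end{align}

Per the abstract, MMI-ALI does not estimate these quantities directly: each negative MMI is upper bounded by feasible adversarial losses that the ensemble of ALIs minimizes during training.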

References (41)
Anthony J. Bell. The co-information lattice. 2003.
Nick Craswell. Mean Reciprocal Rank. Encyclopedia of Database Systems, p. 1703, 2009.
Pat Langley. Crafting Papers on Machine Learning. International Conference on Machine Learning, pp. 1207-1216, 2000.
Ian Goodfellow et al. Generative Adversarial Nets. Neural Information Processing Systems, vol. 27, pp. 2672-2680, 2014.
Z. Wang, A. C. Bovik, H. R. Sheikh, E. P. Simoncelli. Image quality assessment: from error visibility to structural similarity. IEEE Transactions on Image Processing, vol. 13, pp. 600-612, 2004. doi:10.1109/TIP.2003.819861
A. L. Samuel. Some studies in machine learning using the game of checkers. IBM Journal of Research and Development, vol. 44, pp. 206-226, 2000. doi:10.1147/RD.441.0206
W. McGill. Multivariate information transmission. IRE Professional Group on Information Theory, vol. 4, pp. 93-111, 1954. doi:10.1109/TIT.1954.1057469
Nikunj C. Oza. Online Ensemble Learning. National Conference on Artificial Intelligence, p. 1109, 2000.