Authors: Charles Sutton, Khashayar Rohanimanesh, Andrew McCallum
Keywords: Belief propagation, Pattern recognition, Inference, Conditional random field, Bayesian network, Variable elimination, Computer science, Probabilistic logic, Artificial intelligence, Dynamic Bayesian network, Approximate inference, Graphical model, Training set
Abstract: In sequence modeling, we often wish to represent complex interaction between labels, such as when performing multiple, cascaded labeling tasks on the same sequence, or when long-range dependencies exist. We present dynamic conditional random fields (DCRFs), a generalization of linear-chain conditional random fields (CRFs) in which each time slice contains a set of state variables and edges---a distributed state representation as in dynamic Bayesian networks (DBNs)---and parameters are tied across slices. Since exact inference can be intractable in such models, we perform approximate inference using several schedules for belief propagation, including tree-based reparameterization (TRP). On a natural-language chunking task, we show that a DCRF performs better than a series of CRFs, achieving comparable performance using only half the training data.
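As a sketch of the model family the abstract describes (the notation below is assumed standard clique-template CRF notation, not copied from this page): a DCRF can be written with a set of clique templates C, where each template c yields a clique y_{c,t} in every time slice t, and a single weight vector is shared across all slices, which is what "parameters are tied across slices" amounts to:

p(\mathbf{y} \mid \mathbf{x}) \;=\; \frac{1}{Z(\mathbf{x})} \prod_{t} \prod_{c \in C} \exp\!\Big( \sum_{k} \lambda_k \, f_k(\mathbf{y}_{c,t}, \mathbf{x}, t) \Big)

Here f_k are feature functions, \lambda_k their tied weights, and Z(\mathbf{x}) the partition function. Because the within-slice structure can make the graph loopy, computing Z(\mathbf{x}) and marginals exactly can be intractable, which is why the abstract turns to approximate belief propagation under various schedules such as TRP.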