Robust reversible finite-state approach to contextual generation and semantic parsing

Authors: Marc Dymetman, Sriram Venkatapathy, Chunyang Xiao

Abstract: A system and method permit analysis and generation to be performed with the same reversible probabilistic model. The model includes a set of factors, including a canonical factor, which is a function of a logical form and a canonical realization thereof, a similarity factor, which is a function of a text string and a surface string, and a language factor, together with a static context and a dynamic semantic form. When performing generation, the factors are composed to receive as input the output produced when performing analysis, and take […]
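The abstract describes one set of scoring factors that is composed and reused in both directions, analysis (text to logical form) and generation (logical form to text). The patent record above contains no code; the following is a minimal illustrative sketch of that idea in Python, where a candidate (logical form, surface string) pair is scored by a product of simple factor functions and the same composed score drives both a toy parse and a toy generate step. All names and the toy factors (canonical_factor, similarity_factor, language_factor, CANONICAL, parse, generate) are hypothetical assumptions for illustration, not the patent's actual finite-state implementation.

# Illustrative sketch only: a toy "reversible" scorer built from a few factor
# functions, loosely inspired by the abstract's description of composing a
# canonical factor, a similarity factor, and a language factor.
# Everything here is hypothetical and is not taken from the patent.

from difflib import SequenceMatcher
from typing import Callable, Dict, List, Tuple

# A factor maps a (logical_form, surface_string) pair to a non-negative score.
Factor = Callable[[str, str], float]

# Hypothetical canonical realizations of a few logical forms.
CANONICAL: Dict[str, str] = {
    "book(flight, paris)": "book a flight to paris",
    "cancel(reservation)": "cancel the reservation",
}

def canonical_factor(logical_form: str, surface: str) -> float:
    """Scores how well the surface string matches the canonical realization."""
    canon = CANONICAL.get(logical_form, "")
    return SequenceMatcher(None, canon, surface).ratio()

def similarity_factor(logical_form: str, surface: str) -> float:
    """Crude lexical-overlap factor between the logical form and the text."""
    lf_tokens = set(
        logical_form.replace("(", " ").replace(")", " ").replace(",", " ").split()
    )
    sf_tokens = set(surface.split())
    return len(lf_tokens & sf_tokens) / max(len(lf_tokens | sf_tokens), 1)

def language_factor(_logical_form: str, surface: str) -> float:
    """Toy fluency preference: mildly favors shorter surface strings."""
    return 1.0 / (1.0 + 0.1 * len(surface.split()))

FACTORS: List[Factor] = [canonical_factor, similarity_factor, language_factor]

def score_pair(logical_form: str, surface: str) -> float:
    """The same composed model is used in both directions: a product of factors."""
    score = 1.0
    for factor in FACTORS:
        score *= factor(logical_form, surface)
    return score

def parse(surface: str, candidate_lfs: List[str]) -> Tuple[str, float]:
    """Analysis direction: pick the logical form that best explains the text."""
    return max(((lf, score_pair(lf, surface)) for lf in candidate_lfs),
               key=lambda pair: pair[1])

def generate(logical_form: str, candidate_texts: List[str]) -> Tuple[str, float]:
    """Generation direction: pick the surface string that best realizes the form."""
    return max(((txt, score_pair(logical_form, txt)) for txt in candidate_texts),
               key=lambda pair: pair[1])

if __name__ == "__main__":
    print(parse("please book a flight to paris", list(CANONICAL)))
    print(generate("cancel(reservation)", ["cancel the reservation",
                                           "book a flight to paris"]))

Running the sketch prints the highest-scoring logical form for the input text and the highest-scoring surface string for the input logical form. In the patent's framing these candidates would be produced by composing reversible finite-state factors rather than by enumerating and rescoring a fixed candidate list as this toy does.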

References (8)
Günter Neumann, Gertjan van Noord. Reversibility and Self-Monitoring in Natural Language Generation. Springer, Boston, MA, pp. 59–95 (1994). doi:10.1007/978-1-4615-2722-0_3
Ehud Reiter. Has a consensus NL generation architecture appeared, and is it psycholinguistically plausible? Proceedings of the Seventh International Workshop on Natural Language Generation (INLG '94), pp. 163–170 (1994). doi:10.3115/1641417.1641436
Kevin Humphreys, Mike Calcagno, David Weise. Reusing a statistical language model for generation. Workshop on Natural Language Generation, pp. 1–6 (2001). doi:10.3115/1117840.1117852
Juergen Fritsch. Automated extraction of semantic content and generation of a structured document from speech. Journal of the Acoustical Society of America, vol. 127, p. 1178 (2005). doi:10.1121/1.3326925
Feng Rao, Shuai Yue, Bo Chen, Li Lu, Xiang Zhang, Dadong Xie. Method and system for automatic speech recognition (2013).
Nitin Madnani, Martin Chodorow, Joel Tetreault. Round-Trip Translation for Automated Grammatical Error Correction (2014).