Authors: Paul Smolensky, Hamid Palangi, Kenneth D. Forbus, Jianfeng Gao, Qiuyuan Huang
DOI:
Keywords:
Abstract: Generating formal-language programs represented by relational tuples, such as Lisp programs or mathematical operations, to solve problems stated in natural language is a challenging task because it requires explicitly capturing discrete symbolic structural information implicit in the input. However, most general neural sequence models do not explicitly capture such structural information, limiting their performance on these tasks. In this paper, we propose a new encoder-decoder model based on a structured neural representation, Tensor Product Representations (TPRs), for mapping Natural-language problems to Formal-language solutions, called TP-N2F. The encoder of TP-N2F employs TPR `binding' to encode natural-language symbolic structure in vector space, and the decoder uses TPR `unbinding' to generate, in symbolic space, a sequential program in which each step consists of a relation (or operation) and a number of arguments. TP-N2F considerably outperforms LSTM-based seq2seq models on two benchmarks and creates new state-of-the-art results. Ablation studies show that the improvements can be attributed to the use of TPRs in both the encoder and the decoder. Analysis of the learned structures shows how TPRs enhance the interpretability of the model.
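To make the binding/unbinding mechanism mentioned in the abstract concrete, the following is a minimal sketch (not the authors' code) of how a Tensor Product Representation binds filler vectors to role vectors via outer products and recovers a filler by unbinding with the role's dual vector; the dimensions and the assumption of orthonormal roles are illustrative choices, not taken from the paper.

```python
# Minimal TPR binding/unbinding sketch; assumes orthonormal role vectors,
# so each role serves as its own unbinding (dual) vector.
import numpy as np

d_filler, d_role, n = 4, 3, 2  # hypothetical dimensions and number of bindings

fillers = np.random.randn(d_filler, n)                # symbol embeddings
roles = np.linalg.qr(np.random.randn(d_role, n))[0]   # orthonormal role embeddings

# Binding: outer product of each filler with its role; the TPR of the
# whole structure is the sum of the individual bindings.
tpr = sum(np.outer(fillers[:, i], roles[:, i]) for i in range(n))

# Unbinding: multiplying the TPR by a role vector recovers the filler
# bound to that role (exactly, because the roles are orthonormal).
recovered = tpr @ roles[:, 0]
assert np.allclose(recovered, fillers[:, 0])
```

In TP-N2F, as described in the abstract, this kind of binding is used by the encoder to embed the symbolic structure of the natural-language input in vector space, and unbinding is used by the decoder to read out the relation and arguments of each generated tuple.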