Authors: Miikkulainen, Dyer
DOI: 10.1109/IJCNN.1989.118677
Keywords: Time delay neural network, Artificial intelligence, Paraphrase, Artificial neural network, Word (computer architecture), Modular neural network, Natural language processing, Computer science, Lexicon, Recurrent neural network, Natural language
Abstract: Sequential recurrent neural networks have been applied to a fairly high-level cognitive task, i.e. paraphrasing script-based stories. Using hierarchically organized modular subnetworks, which are trained separately and in parallel, the complexity of the task is reduced by effectively dividing it into subgoals. The system uses sequential natural language input and output and develops its own I/O representations for the words. The representations are stored in an external global lexicon and adjusted in the course of training all four subnetworks simultaneously, according to the FGREP method. By concatenating a unique identification with the resulting representation, an arbitrary number of instances of the same word type can be created and used in the stories. The system is able to produce a fully expanded paraphrase of the story from only a few input sentences, with unmentioned events inferred. The word instances are correctly bound to their roles, and simple plausible inferences about the variable content are made in the process.
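The instance-creation idea in the abstract can be sketched in a few lines: a unique identification pattern is concatenated with the shared word-type representation, so multiple instances of one word type share content but remain distinguishable. This is an illustrative sketch only; the function name, the binary ID encoding, and the vector values are assumptions, not taken from the paper.

```python
# Illustrative sketch (not the paper's implementation): build distinct
# word-instance vectors by concatenating a unique ID pattern with the
# shared, FGREP-style content representation of the word type.

def make_instance(type_representation, instance_id, id_bits=2):
    """Concatenate a binary ID pattern with the word-type vector."""
    id_part = [float((instance_id >> b) & 1) for b in range(id_bits)]
    return id_part + list(type_representation)

# A hypothetical learned representation for one word type.
john_type = [0.12, 0.85, 0.33, 0.47]

# Two instances of the same type: identical content part, distinct ID part.
john1 = make_instance(john_type, 1)  # ID part [1.0, 0.0]
john2 = make_instance(john_type, 2)  # ID part [0.0, 1.0]

assert john1[2:] == john2[2:]   # same type content
assert john1[:2] != john2[:2]   # distinct identities
```

The design point is that the content part can keep being adjusted during training (as the abstract describes for the global lexicon) while the ID part stays fixed, so role bindings to specific instances survive representation updates.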