Authors: Trevor Cohen, Dominic Widdows, Manuel Wahle, Roger Schvaneveldt
DOI: 10.1007/978-3-642-54943-4_4
Keywords: Orthography, Natural language processing, Semantic data model, Orthogonality (programming), Vector space, Representation (mathematics), Word (computer architecture), Spelling, Mathematics, Distributional semantics, Artificial intelligence
Abstract: This paper explores a new technique for encoding structured information into a semantic model, and for the construction of vector representations of words and sentences. As an illustrative application, we use this technique to compose robust representations of words based on their sequences of letters, representations that are tolerant of changes such as the transposition, insertion, and deletion of characters. Since these vectors are generated from the written form, or orthography, of a word, we call them 'orthographic vectors'. The representation of discrete letters in a continuous space is an interesting example of a Generalized Quantum process, and the process of generating word vectors is mathematically similar to the derivation of orbital angular momentum in quantum mechanics. The importance of, and sometimes the violation of, orthogonality is discussed in both mathematical settings. The work is grounded in the psychological literature on word recognition, and is also motivated by potential technological applications such as genre-appropriate spelling correction. The method, examples, experiments, and the implementation and availability of the Semantic Vectors package are discussed.
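The core idea summarized in the abstract can be illustrated with a small sketch. The following is not the paper's exact construction: it assumes hypothetical random "letter vectors" and interpolates each position between two fixed endpoint vectors, so that adjacent positions receive similar position vectors. A word vector is then a sum of letter vectors bound (elementwise) to their position vectors; swapping two adjacent letters perturbs the word vector only slightly, giving the transposition tolerance described above.

```python
import numpy as np

rng = np.random.default_rng(0)
DIM = 1000  # illustrative dimensionality, not taken from the paper

# Hypothetical letter vectors: one fixed random vector per letter.
letters = {c: rng.standard_normal(DIM) for c in "abcdefghijklmnopqrstuvwxyz"}

# Two fixed endpoint vectors; positions interpolate between them, so
# nearby positions get similar (non-orthogonal) position vectors.
p_start = rng.standard_normal(DIM)
p_end = rng.standard_normal(DIM)

def position_vector(i, n):
    """Linearly interpolated position vector for slot i of an n-letter word."""
    t = i / max(n - 1, 1)
    return (1 - t) * p_start + t * p_end

def orthographic_vector(word):
    """Sum of letter vectors bound elementwise to their position vectors."""
    v = np.zeros(DIM)
    for i, c in enumerate(word):
        v += letters[c] * position_vector(i, len(word))
    return v / np.linalg.norm(v)

def cosine(a, b):
    return float(a @ b)  # vectors are already unit-normalized

sim_typo = cosine(orthographic_vector("example"), orthographic_vector("exmaple"))
sim_other = cosine(orthographic_vector("example"), orthographic_vector("quantum"))
print(f"example ~ exmaple: {sim_typo:.3f}")
print(f"example ~ quantum: {sim_other:.3f}")
```

Because the transposed form "exmaple" differs from "example" only in two adjacent slots whose position vectors are nearly identical, its similarity remains high, while an unrelated word scores much lower. The paper's actual method, implemented in the Semantic Vectors package, develops this binding in a more principled Generalized Quantum setting.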