Authors: Stephan Gouws, Anders Søgaard
DOI: 10.3115/V1/N15-1157
Keywords:
Abstract: We introduce a simple wrapper method that uses off-the-shelf word embedding algorithms to learn task-specific bilingual embeddings. We use a small dictionary of easily-obtainable, task-specific equivalence classes to produce mixed context-target pairs on which we train the embedding models. Our model has the advantage that it (a) is independent of the choice of embedding algorithm, (b) does not require parallel data, and (c) can be adapted to specific tasks by re-defining the equivalence classes. We show how our method outperforms off-the-shelf embeddings on the task of unsupervised cross-language part-of-speech (POS) tagging, as well as on semi-supervised cross-language super sense (SuS) tagging.
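The abstract describes the method only at a high level: words from a small bilingual dictionary of equivalence classes are mixed into monolingual text so that source- and target-language words come to share contexts, after which any off-the-shelf embedding algorithm can be trained on the mixed data. The following is a minimal sketch of that idea, not the authors' implementation; the `EQUIV` dictionary, the `mix_corpus` helper, and the substitution probability `p` are hypothetical illustrations, and gensim's word2vec (assuming gensim >= 4.0) stands in for any off-the-shelf embedding model.

```python
import random
from gensim.models import Word2Vec  # off-the-shelf embedding model; assumes gensim >= 4.0

# Hypothetical toy equivalence classes (a small, easily-obtainable dictionary):
# each source-language word maps to its target-language equivalents.
EQUIV = {
    "house": ["haus"],
    "dog": ["hund"],
    "the": ["der", "die", "das"],
}

def mix_corpus(sentences, equiv, p=0.5, seed=0):
    """Randomly substitute words with dictionary equivalents so that words
    from both languages appear in shared contexts (mixed context-target pairs)."""
    rng = random.Random(seed)
    mixed = []
    for sent in sentences:
        mixed.append([
            rng.choice(equiv[w]) if w in equiv and rng.random() < p else w
            for w in sent
        ])
    return mixed

if __name__ == "__main__":
    # Tiny monolingual corpus purely for illustration.
    corpus = [
        ["the", "dog", "runs"],
        ["the", "house", "is", "old"],
    ]
    mixed = mix_corpus(corpus, EQUIV)
    # Any embedding algorithm can be trained on the mixed corpus unchanged.
    model = Word2Vec(mixed, vector_size=50, window=2, min_count=1, epochs=50)
    print(mixed)
    print(sorted(model.wv.index_to_key))  # vocabulary now spans both languages
```

Because the task-specific signal lives entirely in the equivalence classes, adapting the method to a different task (e.g., POS or super sense tagging) amounts to re-defining `EQUIV`, leaving the embedding algorithm itself untouched.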