Authors: Colin Raffel, Daniel P. W. Ellis
DOI:
Keywords:
Abstract: We propose a simplified model of attention which is applicable to feed-forward neural networks and demonstrate that the resulting model can solve the synthetic "addition" and "multiplication" long-term memory problems for sequence lengths which are both longer and more widely varying than the best published results for these tasks.
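The abstract does not spell out the mechanism's equations, but attention applied without recurrence generally reduces to computing a score per timestep and taking a softmax-weighted average of the per-timestep features. The sketch below illustrates that general form; the scoring function `tanh(H @ w + b)` and all names and shapes are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def feed_forward_attention(H, w, b):
    """Attention pooling over a sequence without recurrence (a sketch).

    H : (T, d) array of per-timestep feature vectors.
    w : (d,) weight vector and b : scalar bias for one simple choice
        of scoring function, e_t = tanh(h_t . w + b); the actual paper
        may parameterize the scores differently.
    Returns a single (d,) context vector c = sum_t alpha_t * h_t.
    """
    e = np.tanh(H @ w + b)       # unnormalized score for each timestep, shape (T,)
    alpha = np.exp(e - e.max())  # softmax with max-subtraction for numerical stability
    alpha /= alpha.sum()
    return alpha @ H             # weighted average of the timestep features

# Example: pool a length-50 sequence of 16-dimensional features into one vector.
rng = np.random.default_rng(0)
H = rng.standard_normal((50, 16))
c = feed_forward_attention(H, rng.standard_normal(16), 0.0)
print(c.shape)  # (16,)
```

Because each score depends only on its own timestep's features, the whole computation is a single feed-forward pass, which is what makes this style of attention compatible with non-recurrent networks.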