Neurocomputing. 2019 Feb 28;331:281-288. doi: 10.1016/j.neucom.2018.11.066. Epub 2018 Nov 29.
Jared Ostmeyer, Lindsay Cowell
PMID: 30799908 PMCID: PMC6380500 DOI: 10.1016/j.neucom.2018.11.066
Recurrent Neural Networks (RNNs) are a class of statistical models designed to handle sequential data. The model reads a sequence one symbol at a time, processing each symbol using information collected from the previous symbols. With existing RNN architectures, however, each symbol is processed using only information from the immediately preceding processing step. To overcome this limitation, we propose a new kind of RNN model that computes a recurrent weighted average (RWA) over every past processing step. Because the RWA can be computed as a running average, the computational overhead scales like that of any other RNN architecture. The approach essentially reformulates the attention mechanism into a stand-alone model.
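The key computational point in the abstract is that a weighted average over *all* past steps can be maintained incrementally, with constant state per step. The sketch below illustrates that idea in plain Python; it is a simplification of the RWA mechanism, not the authors' full cell (which also includes learned gating and an activation function), and the values `z_t` and weights `a_t` are assumed toy inputs.

```python
def rwa_step(n, d, z, a):
    """One step of a weighted running average.

    n and d carry the running numerator and denominator,
    so the full history never needs to be revisited
    (O(1) work and memory per step, like any other RNN cell).
    """
    n = n + a * z          # accumulate weighted values
    d = d + a              # accumulate weights
    return n, d, n / d     # weighted average over every past step

# Toy sequence: values z_t with attention-like weights a_t (illustrative only).
zs = [1.0, 3.0, 5.0]
ws = [1.0, 2.0, 1.0]

n = d = 0.0
for z, a in zip(zs, ws):
    n, d, avg = rwa_step(n, d, z, a)

# The incremental result matches the batch weighted average over the
# whole sequence, confirming that no past step had to be revisited.
batch = sum(a * z for z, a in zip(zs, ws)) / sum(ws)
print(avg, batch)  # -> 3.0 3.0
```

In the full RWA model, `z_t` and `a_t` are computed from the input and hidden state by learned functions, and the average is passed through a squashing activation to produce the next hidden state.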
Keywords: Attention Mechanism; Recurrent Neural Network; Sequences