Spiking neural networks are being investigated both as biologically plausible models of neural computation and as a potentially more efficient type of neural network. Recurrent neural networks built from gated memory cells have been central to state-of-the-art solutions in problem domains that involve sequence recognition or generation. Here, we design an analog Long Short-Term Memory (LSTM) cell whose neurons can be substituted with efficient spiking neurons, using subtractive gating (following the subLSTM in [1]) instead of multiplicative gating. Subtractive gating yields a less sensitive gating mechanism, which is critical when using spiking neurons. Using fast-adapting spiking neurons with a smoothed Rectified Linear Unit (ReLU)-like effective activation function, we show that an accurate conversion from an analog subLSTM to a continuous-time spiking subLSTM is possible. The resulting architecture yields memory networks that compute very efficiently, with low average firing rates comparable to those of biological neurons, while operating in continuous time.
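As a concrete illustration of the subtractive gating described above, a single analog subLSTM step (following the formulation in [1], where all streams pass through a sigmoid and the gates subtract rather than multiply) can be sketched as follows. The weight stacking, variable names, and function signature are illustrative choices, not the paper's implementation:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sublstm_step(x, h_prev, c_prev, W, U, b):
    """One analog subLSTM step (illustrative sketch).

    W, U, b are stacked for the four streams: input z, input gate i,
    forget gate f, output gate o. Unlike a standard LSTM, the gates
    act subtractively, which makes the gating less sensitive to
    imprecision in the unit activations (e.g. from spiking neurons).
    """
    pre = W @ x + U @ h_prev + b            # stacked pre-activations, shape (4n,)
    z, i, f, o = np.split(sigmoid(pre), 4)  # every stream is sigmoidal
    c = f * c_prev + z - i                  # subtractive input gating
    h = sigmoid(c) - o                      # subtractive output gating
    return h, c
```

Because σ(c) and o both lie in (0, 1), the output h stays in (−1, 1); the gates shift the signal rather than scaling it, so small errors in gate activations perturb the output additively instead of multiplicatively.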

Additional Metadata
Keywords LSTM, Recurrent neural networks, Reinforcement learning, Spiking neurons, Supervised learning
Publisher Springer, Cham
Editor V. Kůrková, Y. Manolopoulos, B. Hammer, L. Iliadis, I. Maglogiannis
Persistent URL dx.doi.org/10.1007/978-3-030-01418-6_28
Series Lecture Notes in Computer Science/Lecture Notes in Artificial Intelligence, Theoretical Computer Science and General Issues, LNTCS 11139
Project Deep Spiking Vision: Better, Faster, Cheaper
Conference Artificial Neural Networks and Machine Learning - ICANN 2018
Pozzi, I., Nusselder, R.B.P., Zambrano, D., & Bohte, S.M. (2018). Gating sensory noise in a spiking subtractive LSTM. In V. Kůrková, Y. Manolopoulos, B. Hammer, L. Iliadis, & I. Maglogiannis (Eds.), Proceedings of Artificial Neural Networks and Machine Learning - ICANN 2018 (pp. 284–293). Springer, Cham. doi:10.1007/978-3-030-01418-6_28