Spiking neural networks are being investigated both as biologically plausible models of neural computation and as a potentially more efficient type of neural network. Recurrent neural networks in the form of networks of gating memory cells have been central to state-of-the-art solutions in problem domains that involve sequence recognition or generation. Here, we design an analog Long Short-Term Memory (LSTM) cell whose neurons can be substituted with efficient spiking neurons, using subtractive gating (following the subLSTM in [1]) instead of multiplicative gating. Subtractive gating yields a less sensitive gating mechanism, which is critical when using spiking neurons. Using fast-adapting spiking neurons with a smoothed Rectified Linear Unit (ReLU)-like effective activation function, we show that an accurate conversion from an analog subLSTM to a continuous-time spiking subLSTM is possible. This architecture results in memory networks that compute very efficiently, with low average firing rates comparable to those of biological neurons, while operating in continuous time.
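As a rough illustration of the subtractive gating described above, the sketch below steps an analog subLSTM cell in NumPy, following the gating equations of the subLSTM in [1]: the input and output gates are subtracted from, rather than multiplied with, the sigmoidal signals, while the forget gate still scales the cell state. The class name SubLSTMCell, the weight initialization, and the toy usage loop are assumptions for illustration, not the authors' code; the paper's spiking version would further replace each sigmoid unit with a fast-adapting spiking neuron, which is not shown here.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class SubLSTMCell:
    """Analog subLSTM step with subtractive gating, after the subLSTM in [1].

    Shapes, initialization, and naming are illustrative assumptions,
    not the authors' implementation.
    """

    def __init__(self, n_in, n_hidden, seed=0):
        rng = np.random.default_rng(seed)
        # One weight set per gate: input (i), forget (f), output (o),
        # and candidate input (z).
        self.W = {g: rng.normal(0.0, 0.1, (n_hidden, n_in)) for g in "ifoz"}
        self.U = {g: rng.normal(0.0, 0.1, (n_hidden, n_hidden)) for g in "ifoz"}
        self.b = {g: np.zeros(n_hidden) for g in "ifoz"}

    def step(self, x, h, c):
        # In the subLSTM every signal, including the candidate z, passes
        # through a sigmoid, so all gated quantities live in (0, 1).
        g = {k: sigmoid(self.W[k] @ x + self.U[k] @ h + self.b[k])
             for k in "ifoz"}
        # Subtractive gating: the input and output gates shift the signal
        # instead of scaling it, so gate noise perturbs the state additively
        # rather than multiplicatively; this reduced sensitivity is what
        # makes the cell amenable to spiking neurons. The forget gate
        # remains multiplicative on the cell state.
        c_new = g["f"] * c + g["z"] - g["i"]
        h_new = sigmoid(c_new) - g["o"]
        return h_new, c_new

# Minimal usage: run the cell over a short random input sequence.
cell = SubLSTMCell(n_in=3, n_hidden=8)
h, c = np.zeros(8), np.zeros(8)
for x in np.random.default_rng(1).normal(size=(10, 3)):
    h, c = cell.step(x, h, c)
```

Because each gate enters the state update as an additive offset bounded in (0, 1), a small error in a gate's activation shifts the cell state by at most that error, rather than rescaling the entire stored signal as in a multiplicative LSTM gate.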

I. Pozzi, R. Nusselder, D. Zambrano, S. Bohte
V. Kůrková, Y. Manolopoulos, B. Hammer, L. Iliadis, I. Maglogiannis
Springer, Cham
doi.org/10.1007/978-3-030-01418-6_28
Lecture Notes in Computer Science/Lecture Notes in Artificial Intelligence, Theoretical Computer Science and General Issues, LNTCS 11139
Deep Spiking Vision: Better, Faster, Cheaper
Artificial Neural Networks and Machine Learning - ICANN 2018
Centrum Wiskunde & Informatica (CWI), Amsterdam, The Netherlands

Pozzi, I., Nusselder, R., Zambrano, D., & Bohte, S. (2018). Gating sensory noise in a spiking subtractive LSTM. In V. Kůrková, Y. Manolopoulos, B. Hammer, L. Iliadis, & I. Maglogiannis (Eds.), Artificial Neural Networks and Machine Learning - ICANN 2018 (pp. 284–293). Springer, Cham. doi:10.1007/978-3-030-01418-6_28