Artificial Neural Networks (ANNs) are bio-inspired models of neural computation that have proven highly effective. Still, ANNs lack a natural notion of time, and neural units in ANNs exchange analog values in a frame-based manner, a computationally and energetically inefficient form of communication. This contrasts sharply with biological neurons that communicate sparingly and efficiently using isomorphic binary spikes. While Spiking Neural Networks (SNNs) can be constructed by replacing the units of an ANN with spiking neurons (Cao et al., 2015; Diehl et al., 2015) to obtain reasonable performance, these SNNs use Poisson spiking mechanisms with exceedingly high firing rates compared to their biological counterparts. Here we show how spiking neurons that employ a form of neural coding can be used to construct SNNs that match high-performance ANNs and match or exceed the state of the art for SNNs on important benchmarks, while requiring firing rates compatible with biological findings. For this, we use spike-based coding based on the firing-rate-limiting adaptation phenomenon observed in biological spiking neurons. This phenomenon can be captured in fast-adapting spiking neuron models, for which we derive the effective transfer function. Neural units in ANNs trained with this transfer function can be substituted directly with adaptive spiking neurons, and the resulting Adaptive SNNs (AdSNNs) can carry out competitive classification in deep neural networks without further modifications. Adaptive spike-based coding additionally allows for the dynamic control of neural coding precision: we show empirically how a simple model of arousal in AdSNNs further halves the average required firing rate, and this notion naturally extends to other forms of attention as studied in neuroscience. AdSNNs thus hold promise as a novel and sparsely active model for neural computation that naturally fits temporally continuous and asynchronous applications.
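
To make the mechanism described above concrete, the sketch below gives one plausible reading of a fast-adapting spiking neuron: a leaky integrate-and-fire unit whose threshold is raised by every spike and decays back toward a baseline, so the attainable firing rate saturates, together with an empirically measured input-to-rate curve that could stand in for the effective transfer function used to train the ANN. All names, parameters, and the specific adaptation rule are illustrative assumptions, not the exact model derived in the paper.

import numpy as np

def adaptive_spiking_neuron(inputs, dt=1e-3, tau_m=0.02, tau_th=0.1,
                            theta0=1.0, dtheta=0.5):
    # Leaky integrate-and-fire neuron with a spike-triggered adaptive
    # threshold (illustrative stand-in for the paper's adapting neuron).
    v = 0.0                      # membrane potential
    theta = theta0               # adaptive firing threshold
    spikes = np.zeros(len(inputs))
    for t, drive in enumerate(inputs):
        v += dt * (-v / tau_m + drive)            # leaky integration of input
        theta += dt * (theta0 - theta) / tau_th   # threshold decays to baseline
        if v >= theta:
            spikes[t] = 1.0
            v -= theta        # reset by subtraction keeps surplus drive
            theta += dtheta   # adaptation: each spike raises the threshold,
                              # which limits the maximum firing rate
    return spikes

def empirical_transfer(currents, T=2.0, dt=1e-3):
    # Mean firing rate (Hz) for constant input currents; this curve plays
    # the role of the ANN activation function in this sketch.
    steps = int(T / dt)
    return np.array([adaptive_spiking_neuron(np.full(steps, c), dt=dt).sum() / T
                     for c in currents])

if __name__ == "__main__":
    rates = empirical_transfer(np.linspace(0.0, 200.0, 5))
    print(rates)   # firing rate saturates as input grows, due to adaptation

In this reading, an ANN trained with the measured (or analytically derived) rate curve as its activation can have its units swapped one-for-one for the adapting spiking neurons, since both respond identically to a constant input.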

Deep Spiking Vision: Better, Faster, Cheaper

Zambrano, D., Nusselder, R., Scholte, S., & Bohte, S. (2019). Sparse computation in adaptive spiking neural networks. Frontiers in Neuroscience, 12, 987:1–987:11. doi:10.3389/fnins.2018.00987