
Low-activity supervised convolutional spiking neural networks applied to speech commands recognition

Abstract: Deep Neural Networks (DNNs) are the current state-of-the-art models in many speech-related tasks. There is growing interest, though, in more biologically realistic, hardware-friendly and energy-efficient models, named Spiking Neural Networks (SNNs). Recently, it has been shown that SNNs can be trained efficiently, in a supervised manner, using backpropagation with a surrogate gradient trick. In this work, we report speech command (SC) recognition experiments using supervised SNNs. We explored the Leaky Integrate-and-Fire (LIF) neuron model for this task, and show that a model composed of stacked dilated convolution spiking layers can reach an error rate very close to standard DNNs on the Google SC v1 dataset: 5.5%, while keeping a very sparse spiking activity, below 5%, thanks to a new regularization term. We also show that modeling the leakage of the neuron membrane potential is useful, since the LIF model significantly outperformed its non-leaky counterpart.
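To make the abstract's ingredients concrete, below is a minimal PyTorch sketch of a supervised LIF layer trained with a surrogate gradient, plus one plausible form of activity regularization. This is not the authors' implementation: the fast-sigmoid surrogate, the soft reset, and the shape of the regularizer (`SpikeFn`, `LIFLayer`, `activity_reg`, `beta`, `v_th` are all illustrative names) are assumptions, not details taken from the paper.

```python
import torch


class SpikeFn(torch.autograd.Function):
    """Heaviside spike in the forward pass, surrogate gradient in the backward pass."""

    @staticmethod
    def forward(ctx, v):
        ctx.save_for_backward(v)
        return (v > 0).float()

    @staticmethod
    def backward(ctx, grad_output):
        (v,) = ctx.saved_tensors
        # Fast-sigmoid surrogate: d(spike)/dv ~ 1 / (1 + |v|)^2
        return grad_output / (1.0 + v.abs()) ** 2


class LIFLayer(torch.nn.Module):
    """One fully connected LIF layer unrolled over time."""

    def __init__(self, n_in, n_out, beta=0.9, v_th=1.0):
        super().__init__()
        self.fc = torch.nn.Linear(n_in, n_out)
        self.beta = beta  # membrane leak factor; beta = 1 gives the non-leaky counterpart
        self.v_th = v_th  # firing threshold

    def forward(self, x):
        # x: (batch, time, n_in) -> spikes: (batch, time, n_out)
        batch, steps, _ = x.shape
        v = torch.zeros(batch, self.fc.out_features, device=x.device)
        spikes = []
        for t in range(steps):
            v = self.beta * v + self.fc(x[:, t])  # leaky integration of input current
            s = SpikeFn.apply(v - self.v_th)      # spike on threshold crossing
            v = v - s * self.v_th                 # soft reset after a spike
            spikes.append(s)
        return torch.stack(spikes, dim=1)


def activity_reg(spikes, target_rate=0.05):
    """Penalize mean firing above a target rate (one plausible sparsity term)."""
    return torch.relu(spikes.mean() - target_rate) ** 2
```

Setting `beta = 1` removes the leak, giving the non-leaky model the abstract compares against; in training, a term like `activity_reg` would be added to the task loss with a weight that trades accuracy against spike sparsity.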
Complete list of metadata

https://hal.archives-ouvertes.fr/hal-03007620
Contributor: Thomas Pellegrini
Submitted on: Monday, November 16, 2020 - 2:23:01 PM
Last modification on: Thursday, January 28, 2021 - 3:53:46 PM
Long-term archiving on: Wednesday, February 17, 2021 - 7:13:34 PM

File

final_PAPER_SLT_2021.pdf
Files produced by the author(s)

Identifiers

  • HAL Id: hal-03007620, version 1
  • ARXIV: 2011.06846

Citation

Thomas Pellegrini, Romain Zimmer, Timothée Masquelier. Low-activity supervised convolutional spiking neural networks applied to speech commands recognition. IEEE Spoken Language Technology Workshop (SLT 2021), Jan 2021, Shenzhen (virtual), China. ⟨hal-03007620⟩
