Daily Arxiv

This is a page that curates AI-related papers published worldwide.
All content is summarized with Google Gemini, and the page is operated on a non-profit basis.
Copyright for each paper belongs to the authors and their institutions; please make sure to credit the source when sharing.

NSPDI-SNN: An efficient lightweight SNN based on nonlinear synaptic pruning and dendritic integration

Created by
  • Haebom

Authors

Wuque Cai, Hongze Sun, Jiayi He, Qianqian Liao, Yunliang Zang, Duo Chen, Dezhong Yao, Daqing Guo

Outline

In this paper, inspired by the complex dendritic structure of biological neurons, we propose NSPDI-SNN, an efficient and lightweight SNN method that incorporates nonlinear dendritic integration (NDI) and nonlinear synaptic pruning (NSP). NDI enriches the representation of spatiotemporal information in neurons, while NSP yields highly sparse synaptic connectivity. We evaluate the method on the DVS128 Gesture, CIFAR10-DVS, and CIFAR10 datasets, as well as on a speech recognition task and a reinforcement-learning-based maze navigation task. Across all tasks, NSPDI-SNN achieves high sparsity with minimal performance degradation, and it obtains the best results on three event stream datasets, showing that NSPDI substantially improves the efficiency of synaptic information transfer as sparsity increases. In conclusion, by exploiting the complex structure and nonlinear computation of dendrites, NSPDI offers a promising route toward efficient SNNs.
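To make the two ingredients above more concrete, here is a minimal, hedged PyTorch sketch of how a spiking layer with branch-wise nonlinear dendritic integration and magnitude-based synaptic pruning could be organized. The class name `DendriticLIFLayer`, the `tanh` branch nonlinearity, the LIF dynamics, and the pruning rule are illustrative assumptions chosen for clarity; they are not the paper's NSPDI implementation.

```python
# Illustrative sketch (not the authors' code): synapses are grouped into dendritic
# branches, each branch is integrated nonlinearly before somatic summation, and a
# binary mask sparsifies synapses via simple magnitude pruning.
import torch
import torch.nn as nn


class DendriticLIFLayer(nn.Module):
    """LIF spiking layer whose inputs are integrated per dendritic branch."""

    def __init__(self, in_features, out_features, n_branches=4,
                 tau=2.0, v_threshold=1.0):
        super().__init__()
        assert in_features % n_branches == 0
        self.n_branches = n_branches
        self.branch_size = in_features // n_branches
        # Weights viewed as (out_neurons, branches, synapses_per_branch).
        self.weight = nn.Parameter(
            torch.randn(out_features, n_branches, self.branch_size) * 0.1)
        # Binary pruning mask (1 = kept synapse, 0 = pruned).
        self.register_buffer("mask", torch.ones_like(self.weight))
        self.tau = tau
        self.v_threshold = v_threshold

    def prune(self, prune_fraction=0.5):
        """Drop the smallest-|w| synapses. The paper's nonlinear pruning rule is
        more elaborate; magnitude pruning is only a stand-in here."""
        w = (self.weight * self.mask).abs().flatten()
        k = int(prune_fraction * w.numel())
        if k > 0:
            threshold = torch.kthvalue(w, k).values
            self.mask.copy_((self.weight.abs() > threshold).float())

    def forward(self, x_seq):
        """x_seq: (T, batch, in_features) spike trains -> (T, batch, out) spikes."""
        T, batch, _ = x_seq.shape
        v = torch.zeros(batch, self.weight.shape[0], device=x_seq.device)
        out = []
        w = self.weight * self.mask
        for t in range(T):
            # Split presynaptic spikes into branches: (batch, branches, branch_size).
            x_b = x_seq[t].view(batch, self.n_branches, self.branch_size)
            # Linear integration within each branch: (batch, out, branches).
            branch_in = torch.einsum("bgs,ogs->bog", x_b, w)
            # Nonlinear dendritic integration before summing at the soma
            # (tanh is an illustrative choice, not the paper's NDI function).
            soma_in = torch.tanh(branch_in).sum(dim=-1)
            # Leaky integrate-and-fire dynamics with hard reset.
            v = v + (soma_in - v) / self.tau
            spikes = (v >= self.v_threshold).float()
            v = v * (1.0 - spikes)
            out.append(spikes)
        return torch.stack(out)


# Usage: 10 time steps, batch of 2, 64 inputs grouped into 4 dendritic branches.
layer = DendriticLIFLayer(64, 32, n_branches=4)
layer.prune(prune_fraction=0.7)                 # sparsify synapses
spikes_in = (torch.rand(10, 2, 64) < 0.2).float()
spikes_out = layer(spikes_in)                   # shape (10, 2, 32)
print(spikes_out.shape, layer.mask.mean().item())
```

The sketch only covers the forward pass; training a real SNN would additionally require surrogate gradients for the spike threshold, which the paper's framework handles but this illustration omits.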

Takeaways, Limitations

Takeaways:
  • Presents an efficient SNN implementation method that mimics the dendritic structure of biological neurons.
  • Achieves both high sparsity and strong performance through nonlinear dendritic integration and nonlinear synaptic pruning.
  • Verifies effectiveness across diverse tasks (event-stream data processing, speech recognition, reinforcement learning).
  • Analysis confirms that synaptic information transfer becomes more efficient as sparsity increases.
Limitations:
  • Further research is needed on the generalization performance of the proposed method.
  • Experiments on larger and more complex datasets are needed.
  • The model may not capture every aspect of biological dendritic structure.