Daily Arxiv

This page collects papers related to artificial intelligence published around the world.
The summaries are generated with Google Gemini, and the page is operated on a non-profit basis.
Copyright for each paper belongs to its authors and their institutions; when sharing, simply cite the source.

NSPDI-SNN: An efficient lightweight SNN based on nonlinear synaptic pruning and dendritic integration

Created by
  • Haebom

Author

Wuque Cai, Hongze Sun, Jiayi He, Qianqian Liao, Yunliang Zang, Duo Chen, Dezhong Yao, Daqing Guo

Outline

In this study, inspired by the dendrites of biological neurons, we propose an efficient, lightweight SNN (NSPDI-SNN) that combines nonlinear synaptic pruning and dendritic integration. The method introduces nonlinear dendritic integration (NDI) to enhance the spatiotemporal information representation of neurons, and a novel nonlinear synaptic pruning (NSP) method to achieve high sparsity in the SNN. We evaluate NSPDI-SNN on benchmark datasets such as DVS128 Gesture, CIFAR10-DVS, and CIFAR10, as well as on speech recognition and reinforcement learning-based maze navigation tasks. In all tasks, NSPDI-SNN achieves high sparsity with minimal performance degradation. In particular, it achieves the best results on the event stream datasets, and the efficiency of synaptic information transfer improves significantly as sparsity increases.
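The two ideas in the outline can be illustrated with a toy sketch: synapses are grouped into dendritic branches whose inputs are summed and passed through a per-branch nonlinearity before reaching the soma, and low-magnitude synapses are pruned to a target sparsity. This is only an assumed, simplified stand-in (tanh branch nonlinearity, magnitude-based pruning), not the paper's actual NDI or NSP formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy layer: 4 dendritic branches, 8 synapses per branch.
n_branches, syn_per_branch = 4, 8
W = rng.normal(size=(n_branches, syn_per_branch))
x = rng.normal(size=syn_per_branch)

def nonlinear_dendritic_integration(W, x):
    """Sum synaptic input within each branch, apply a per-branch
    nonlinearity (tanh here, an assumption), then sum the branch
    outputs at the soma."""
    branch_currents = W @ x                 # linear sum within each branch
    return np.tanh(branch_currents).sum()   # nonlinear branches -> soma

def magnitude_prune(W, sparsity):
    """Zero out the smallest-magnitude synapses until the target
    sparsity is reached (a simple stand-in for the paper's NSP)."""
    k = int(sparsity * W.size)
    thresh = np.sort(np.abs(W), axis=None)[k]
    return np.where(np.abs(W) >= thresh, W, 0.0)

W_sparse = magnitude_prune(W, sparsity=0.8)
print("kept fraction:", np.count_nonzero(W_sparse) / W.size)
print("soma output:", nonlinear_dendritic_integration(W_sparse, x))
```

The point of the sketch is the structural contrast with a standard point neuron: the nonlinearity is applied per branch before somatic summation, so a highly pruned weight matrix can still produce a rich input-output mapping.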

Takeaways, Limitations

Takeaways:
The complex structure and nonlinear computation of dendrites offer a promising approach for developing efficient SNNs.
NSPDI-SNN has shown excellent performance while maintaining high sparsity across various tasks (image, speech, and reinforcement learning).
Nonlinear pruning (NSP) improves the efficiency of synaptic information transmission.
It performed particularly well on event stream datasets.
Limitations:
The paper does not explicitly state its limitations.