Daily Arxiv

This is a page that curates AI-related papers published worldwide.
All content here is summarized using Google Gemini and operated on a non-profit basis.
Copyright for each paper belongs to the authors and their institutions; please make sure to credit the source when sharing.

Learning with Spike Synchrony in Spiking Neural Networks

Created by
  • Haebom

Author

Yuchen Tian, Assel Kembay, Samuel Tensingh, Nhan Duy Truong, Jason K. Eshraghian, Omid Kavehei

Outline

This paper highlights the problem that conventional spiking neural network (SNN) learning rules focus solely on individual spike pairs, failing to leverage the synchronized activity patterns that drive learning in biological systems. The authors propose a novel learning method, Spike Synchronization-Dependent Plasticity (SSDP), which adjusts synaptic weights based on the degree of synchronization of neural firing rather than the temporal order of spikes. SSDP operates as a local, post-optimization mechanism, applying updates to a sparse subset of parameters while maintaining computational efficiency through linear scaling. Furthermore, it integrates seamlessly with standard backpropagation while preserving the forward computational graph. Experiments on a variety of datasets, from static images to high-temporal-resolution tasks, and across architectures ranging from single-layer SNNs to spiking transformers, demonstrate improved convergence stability and robustness to spike-time jitter and event noise. This provides new insight into how biological neural networks can leverage synchronized activity for efficient information processing, suggesting that synchronization-dependent plasticity is a fundamental computational principle of neural learning.
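The core idea described above — strengthening weights according to how synchronously neurons fire, and touching only a sparse subset of parameters — might be sketched as follows. This is a toy illustration, not the paper's actual rule: the function name, the coincidence-window synchrony measure, and the top-k sparsification are all assumptions, and this naive pairwise version scales quadratically in the number of neurons, unlike the linear scaling the paper reports.

```python
import numpy as np

def ssdp_update(weights, spikes, lr=0.01, window=2, top_k=0.1):
    """Hypothetical sketch of a synchrony-dependent update: strengthen
    connections between neuron pairs that fire together within a short
    window, modifying only the most-synchronous (sparse) subset of weights.
    spikes: (T, N) binary array of spike trains over T time steps."""
    T, N = spikes.shape
    # Smooth each spike train over the coincidence window, so spikes
    # that land within `window` steps of each other count as coincident.
    kernel = np.ones(window)
    smoothed = np.apply_along_axis(
        lambda s: np.convolve(s, kernel, mode="same"), 0, spikes)
    # Pairwise synchrony score: co-activity summed over time.
    sync = smoothed.T @ smoothed
    np.fill_diagonal(sync, 0.0)  # ignore self-synchrony
    # Sparse update: keep only the top fraction of pairs by synchrony.
    k = max(1, int(top_k * sync.size))
    thresh = np.partition(sync.ravel(), -k)[-k]
    mask = sync >= thresh
    # Note: synchrony alone drives the update, not spike order (unlike STDP).
    return weights + lr * sync * mask

rng = np.random.default_rng(0)
spikes = (rng.random((100, 8)) < 0.2).astype(float)
W = np.zeros((8, 8))
W_new = ssdp_update(W, spikes)
print(np.count_nonzero(W_new))  # only a sparse subset of weights changed
```

The contrast with STDP is that the score here is symmetric in the two neurons (which fired first does not matter), matching the paper's emphasis on synchrony over temporal order.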

Takeaways, Limitations

Takeaways:
  • Provides new insight into how biological neural networks leverage synchronous activity for efficient information processing.
  • Proposes synchronization-dependent plasticity as a core computational principle of neural learning.
  • SSDP improves SNNs' convergence stability and robustness to spike-time jitter and event noise.
  • Integrates seamlessly with existing backpropagation-based training while maintaining computational efficiency.
Limitations:
  • Further validation of the method's generalization performance is needed.
  • More extensive experiments across diverse network architectures and datasets are required.
  • Additional research is needed to map SSDP accurately onto biological neural networks.