Daily Arxiv

This page collects papers on artificial intelligence published around the world.
Summaries are generated with Google Gemini, and the page is operated on a non-profit basis.
Copyright of each paper belongs to its authors and their institutions; please cite the source when sharing.

Synaptic Pruning: A Biological Inspiration for Deep Learning Regularization

Created by
  • Haebom

Authors

Gideon Vos, Liza van Eijk, Zoltan Sarnyai, Mostafa Rahimi Azghadi

Outline

Inspired by synaptic pruning in the biological brain, we propose a magnitude-based pruning method that progressively removes low-importance connections during training. The method replaces dropout, integrates directly into the training loop, and can be applied to various time-series forecasting models, including RNNs, LSTMs, and the Patch Time Series Transformer. Weight importance is computed from absolute magnitude, and a cubic schedule progressively increases global sparsity. By periodically and permanently removing low-importance weights while maintaining gradient flow for the remaining active weights, the method eliminates the need for separate pruning and fine-tuning steps.
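As a rough illustration of the procedure described above, the sketch below shows one way magnitude-based pruning with a cubic sparsity schedule could be wired into a training loop. It is written in PyTorch under assumptions not stated in the summary (the pruning interval, final sparsity, and helper names such as `cubic_sparsity` and `prune_by_magnitude` are hypothetical), and it is not the authors' implementation.

```python
import torch


def cubic_sparsity(step, total_steps, final_sparsity, initial_sparsity=0.0):
    """Cubic schedule: sparsity ramps from initial to final over training."""
    t = min(step / total_steps, 1.0)
    return final_sparsity + (initial_sparsity - final_sparsity) * (1.0 - t) ** 3


def prune_by_magnitude(model, sparsity):
    """Permanently zero out the lowest-magnitude weights, using a global threshold."""
    all_weights = torch.cat([p.detach().abs().flatten()
                             for name, p in model.named_parameters() if "weight" in name])
    k = int(sparsity * all_weights.numel())
    if k == 0:
        return {}
    threshold = torch.kthvalue(all_weights, k).values
    masks = {}
    for name, p in model.named_parameters():
        if "weight" in name:
            masks[name] = (p.detach().abs() > threshold).float()
            p.data.mul_(masks[name])  # remove low-importance weights in place
    return masks


# Hypothetical training loop: prune every `prune_every` steps and re-apply the
# masks after each optimizer step, so pruned weights stay at zero while the
# remaining active weights keep receiving gradients.
def train(model, loader, optimizer, loss_fn, total_steps,
          final_sparsity=0.5, prune_every=100):
    masks, step = {}, 0
    for x, y in loader:
        optimizer.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        optimizer.step()
        if step % prune_every == 0:
            masks = prune_by_magnitude(
                model, cubic_sparsity(step, total_steps, final_sparsity))
        # Keep previously pruned weights at zero (permanent removal).
        for name, p in model.named_parameters():
            if name in masks:
                p.data.mul_(masks[name])
        step += 1
        if step >= total_steps:
            break
```

The key design point reflected here is that pruning happens inside the ordinary training loop, so no separate pruning-then-fine-tuning phase is required.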

Takeaways, Limitations

Takeaways:
A novel regularization technique that improves efficiency by mimicking biological synaptic pruning.
Replaces dropout and improves performance across various time-series forecasting models.
Reduces MAE by up to 20% in financial forecasting and by up to 52% in some Transformer models.
Integrates directly into the training loop for ease of use.
Requires no separate pruning and fine-tuning steps.
Limitations:
Performance may vary depending on the specific model architecture or dataset characteristics.
Further research on generalization performance is needed.
Does not fully replicate pruning in the biological brain (e.g., activity-dependent pruning).