Daily Arxiv

This page curates AI-related papers published worldwide.
All content is summarized using Google Gemini, and the page is operated on a non-profit basis.
Copyright for each paper belongs to its authors and their institutions; please credit the source when sharing.

Hyperflux: Pruning Reveals the Importance of Weights

Created by
  • Haebom

Author

Eugen Barbulescu, Antonio Alexoaie, Lucian Busoniu

Outline

This paper proposes Hyperflux, a network pruning technique for reducing the inference latency and power consumption of neural networks. Whereas existing pruning methods rely primarily on empirical results, Hyperflux is a conceptually grounded L0 pruning approach that estimates the importance of each weight as the gradient response (flux) to its removal. A global pressure term continuously pushes all weights toward pruning, while weights critical for accuracy automatically regrow according to their flux. The paper presents and experimentally validates several properties that follow naturally from the Hyperflux framework, and derives a generalized scaling-law equation relating final sparsity to pressure, which is used to design a sparsity-controlling scheduler. Experiments with ResNet-50 and VGG-19 on CIFAR-10 and CIFAR-100 demonstrate state-of-the-art results.
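
The summary above describes the mechanism only at a high level, so here is a minimal, hypothetical sketch of the general idea: each weight is multiplied by a trainable gate, the "flux" of a weight is approximated as the magnitude of the task-loss gradient on its gate (how strongly the loss reacts to removing that weight), and a global pressure penalty on the gates pushes every weight toward pruning. The gate parameterization, layer, and pressure value below are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn

class GatedLinear(nn.Module):
    """Linear layer whose weights are multiplied by trainable per-weight gates."""
    def __init__(self, in_features, out_features):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(out_features, in_features) * 0.01)
        # Soft per-weight mask, initialised strictly inside (0, 1) for illustration.
        self.gate = nn.Parameter(torch.full((out_features, in_features), 0.9))

    def forward(self, x):
        return x @ (self.weight * self.gate.clamp(0.0, 1.0)).t()

layer = GatedLinear(16, 4)
opt = torch.optim.SGD(layer.parameters(), lr=0.1)
x, target = torch.randn(8, 16), torch.randn(8, 4)
pressure = 1e-3  # global pressure strength (assumed hyperparameter)

task_loss = nn.functional.mse_loss(layer(x), target)

# Flux proxy: sensitivity of the task loss to each gate, i.e. how much the loss
# would react if the corresponding weight were pushed toward removal.
flux = torch.autograd.grad(task_loss, layer.gate, retain_graph=True)[0].abs()

# Global pressure: a penalty that drives every gate (and thus every weight) toward 0.
loss = task_loss + pressure * layer.gate.clamp(0.0, 1.0).sum()
opt.zero_grad()
loss.backward()
opt.step()

print("mean flux:", flux.mean().item(), "mean gate:", layer.gate.data.mean().item())
```

In the actual method, the interplay between flux and pressure is what lets accuracy-critical weights regrow while the rest are driven to zero; the sketch only shows how a flux-like signal and a pressure term could be read off within a single training step.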

Takeaways, Limitations

Takeaways:
Unlike existing empirical methods, we present a conceptually clear L0 pruning method that estimates the importance of weights through flux.
Efficient sparsity control is achieved by deriving a generalized scaling law equation that describes the relationship between sparsity and pressure.
Experiments using the CIFAR-10 and CIFAR-100 datasets with ResNet-50 and VGG-19 models demonstrate state-of-the-art performance.
Limitations:
The effectiveness of the proposed method may be limited to specific network structures and datasets.
Generalization performance needs to be verified on other larger networks or datasets.
The computational cost of calculating flux can be relatively high.