Daily Arxiv

This page organizes papers related to artificial intelligence published around the world.
This page is summarized using Google Gemini and is operated on a non-profit basis.
The copyright of each paper belongs to its authors and their institutions. When sharing, please cite the source.

Flow-Induced Diagonal Gaussian Processes

Created by
  • Haebom

Author

Moule Lin, Andrea Patane, Weipeng Jing, Shuhao Guan, Goetz Botterweck

Outline

FiD-GP is a compression framework that uses compact inducing weight matrices to project a neural network's weight uncertainty into a low-dimensional subspace. It increases expressiveness through a normalizing-flow dictionary and spectral normalization, and aligns the inducing subspace with the feature-gradient geometry via a numerically stable projection mechanism. FiD-GP's predictive framework also yields a single-pass projection score for out-of-distribution (OoD) detection. Compared with SVGP-based baselines on regression, image classification, semantic segmentation, and OoD detection benchmarks, it improves uncertainty estimation, satisfies tight spectral residual bounds with theoretical guarantees for OoD detection, and substantially reduces the network's storage requirements.
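
The sketch below is a minimal, hypothetical illustration of the geometric idea described above: compressing a diagonal weight covariance into a low-dimensional inducing subspace and using a single-pass projection residual as an OoD score. Function names, shapes, and the orthonormal-basis assumption are illustrative only and are not the authors' implementation.

```python
import torch


def project_uncertainty(weight_var_diag, U):
    """Compress a diagonal weight covariance into the inducing subspace.

    weight_var_diag: (d,) per-weight variances (diagonal covariance).
    U: (d, k) inducing basis with k << d.
    Returns the (k, k) covariance of the projected uncertainty.
    """
    return U.T @ torch.diag(weight_var_diag) @ U


def ood_score(features, U):
    """Single-pass projection residual as a toy OoD score.

    features: (n, d) feature vectors; U: (d, k) with orthonormal columns.
    The score is the feature energy lying outside the inducing subspace;
    larger values suggest the input is further from the training data.
    """
    projected = (features @ U) @ U.T      # component inside the subspace
    residual = features - projected       # component outside the subspace
    return residual.norm(dim=-1)


if __name__ == "__main__":
    torch.manual_seed(0)
    d, k = 512, 16                                   # toy dimensions
    U, _ = torch.linalg.qr(torch.randn(d, k))        # toy orthonormal basis
    variances = torch.rand(d)                        # toy per-weight variances
    print(project_uncertainty(variances, U).shape)   # torch.Size([16, 16])
    print(ood_score(torch.randn(8, d), U).shape)     # torch.Size([8])
```

The point of the sketch is only that the uncertainty bookkeeping scales with k rather than d, and that the residual can be computed in a single forward pass.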

Takeaways, Limitations

  • Substantially reduces the cost of Bayesian training.
  • Compresses parameters by about 51%.
  • Reduces model size by approximately 75%.
  • Matches state-of-the-art accuracy and uncertainty estimation.
  • Limitation: inference cost grows with the number of inducing weights.