Daily Arxiv

This page collects artificial intelligence papers published around the world.
Summaries are generated with Google Gemini, and the page is operated on a non-profit basis.
Copyright of each paper belongs to its authors and their institutions; when sharing, please cite the source.

FusedANN: Convexified Hybrid ANN via Attribute-Vector Fusion

Created by
  • Haebom

Author

Alireza Heidari, Wei Zhang, Ying Xiong

Outline

FusedANN is a geometric framework for hybrid queries that combine attribute filters with vector similarity. It lifts attributes into constraints on Approximate Nearest Neighbor (ANN) optimization and, through a Lagrangian-like relaxation, introduces a convex fused search space. A transformer-based convexification fuses attributes and vectors, converting hard filters into continuous weighted penalties that preserve top-k semantics and enable efficient approximate search. Under high selectivity, FusedANN reduces to exact filtering; when exact matches are insufficient, it relaxes smoothly toward the semantically closest attributes while preserving the downstream ANN alpha-approximation guarantee. Compared with existing hybrid systems, FusedANN delivers up to 3x higher throughput with improved recall.
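To make the "hard filter relaxed into a weighted penalty" idea concrete, below is a minimal Python sketch. It is not the paper's transformer-based fusion or its indexing scheme; the function names, the single penalty weight `lam`, and the brute-force top-k search are illustrative assumptions standing in for a real ANN index over a fused space.

```python
import numpy as np

def fused_distance(query_vec, query_attr, item_vec, item_attr, lam=1.0):
    """Hybrid distance sketch: vector distance plus a weighted
    attribute-mismatch penalty (the hard filter relaxed to a soft term)."""
    vec_term = np.linalg.norm(query_vec - item_vec)
    attr_term = float(query_attr != item_attr)  # 0 if the attribute matches, 1 otherwise
    return vec_term + lam * attr_term

def hybrid_top_k(query_vec, query_attr, items, k=5, lam=1.0):
    """Brute-force top-k over the fused score; a real system would run
    approximate search over a fused embedding space instead."""
    scored = [
        (fused_distance(query_vec, query_attr, vec, attr, lam), idx)
        for idx, (vec, attr) in enumerate(items)
    ]
    scored.sort(key=lambda t: t[0])
    return [idx for _, idx in scored[:k]]

# Toy usage: 1000 random 64-d vectors, each tagged with a categorical attribute.
rng = np.random.default_rng(0)
items = [(rng.standard_normal(64), rng.choice(["red", "blue"])) for _ in range(1000)]
q_vec, q_attr = rng.standard_normal(64), "red"
print(hybrid_top_k(q_vec, q_attr, items, k=3, lam=5.0))
```

With a large `lam` the penalty behaves like an exact filter (mismatched items are pushed out of the top-k), while a small `lam` lets semantically close but attribute-mismatched items surface, mirroring the exact-to-relaxed behavior described above.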

Takeaways, Limitations

Takeaways:
Improved hybrid query performance: outperforms existing hybrid systems with up to 3x higher throughput.
Integrates the filtering step directly into ANN optimization, so no specialized index tuning is required.
Flexibility between exact and relaxed filtering makes it suitable for both high- and low-selectivity scenarios.
Provides theoretical error bounds and parameter-selection rules that improve practical applicability.
Limitations:
The paper does not explicitly state its own limitations.
Potential computational overhead of the transformer-based embeddings (not quantified in the paper).