This paper presents AttentionDSE, a novel design space exploration (DSE) framework for CPU design in high-dimensional design spaces. Existing DSE frameworks suffer from inaccurate and poorly scaling surrogate models, inefficient exploration that relies on manual heuristics or exhaustive search, and limited interpretability. AttentionDSE addresses these issues by unifying performance prediction and design guidance in an attention-based neural network architecture: the attention weights serve a dual role, providing accurate performance estimates while simultaneously exposing performance bottlenecks. Key innovations include a perception-driven attention mechanism that exploits hierarchy and locality to reduce complexity from $\mathcal{O}(n^2)$ to $\mathcal{O}(n)$, and an attention-aware bottleneck analysis that automatically identifies the critical parameters for goal-directed optimization. On a high-dimensional CPU design space evaluated with the SPEC CPU2017 benchmark suite, AttentionDSE achieves up to 3.9% higher Pareto hypervolume and reduces search time by more than 80% compared with state-of-the-art baselines.
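
The abstract does not spell out how the perception-driven attention achieves linear cost, so the following is only a minimal sketch of one standard way locality yields $\mathcal{O}(n)$ attention: each query attends to a fixed-size window of neighboring keys, so total work is $\mathcal{O}(n \cdot w)$ for constant $w$. The function and parameter names (`local_window_attention`, `window`) are illustrative assumptions, not AttentionDSE's actual API.

```python
# Hypothetical sketch, NOT the paper's implementation: local (windowed)
# attention in which position i attends only to keys in [i - w, i + w],
# mirroring locality among neighboring design parameters. With constant
# window size w, total work is O(n * w) = O(n) instead of O(n^2).
import torch
import torch.nn.functional as F

def local_window_attention(q, k, v, window: int = 8):
    """q, k, v: (n, d) tensors of per-parameter embeddings (illustrative)."""
    n, d = q.shape
    scale = d ** -0.5
    out = torch.empty_like(v)
    for i in range(n):                               # O(n) iterations
        lo, hi = max(0, i - window), min(n, i + window + 1)
        scores = (q[i] @ k[lo:hi].T) * scale         # O(window) work per query
        out[i] = F.softmax(scores, dim=-1) @ v[lo:hi]
    return out

# Toy usage: 64 design parameters embedded in 16 dimensions.
n, d = 64, 16
q, k, v = (torch.randn(n, d) for _ in range(3))
print(local_window_attention(q, k, v).shape)         # torch.Size([64, 16])
```

Under this reading, the per-query attention weights over the local window are exactly the quantities a bottleneck analysis could inspect: parameters receiving consistently high weight for a poorly performing design are candidates for adjustment.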