In this paper, we propose FiSKE, a novel method that integrates external knowledge bases such as knowledge graphs to mitigate the knowledge deficiency of large language models (LLMs). Whereas existing methods treat the entire question as the retrieval target and incrementally retrieve relevant knowledge from the knowledge graph, FiSKE decomposes the question into fine-grained clues and resolves ambiguities between these clues and the graph through an adaptive mapping strategy. It balances accuracy and efficiency via a clue-based termination mechanism: fully mapped knowledge paths are passed to the LLM, and the method falls back to chain-of-thought reasoning only when necessary. Experiments on multiple datasets show that FiSKE outperforms existing state-of-the-art methods in knowledge retrieval performance while significantly reducing the number of LLM calls.