This paper proposes Adaptive Context Compression (ACC) to address the high inference cost of retrieval-augmented generation (RAG). Unlike existing fixed-ratio compression methods, ACC-RAG dynamically adjusts the compression ratio to the complexity of the input query, improving both efficiency and accuracy. It uses a hierarchical compressor and a context selector to retain only the minimum necessary information, much as a human skims a text. Experiments on Wikipedia and five question-answering (QA) datasets show that ACC-RAG outperforms fixed-ratio baselines and achieves over four times faster inference than standard RAG while maintaining or improving accuracy.
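To make the adaptive idea concrete, the following is a minimal illustrative sketch, not the paper's implementation: all function names, the word-subsampling "compressor", and the relevance-threshold stopping rule are assumptions standing in for ACC-RAG's learned components. The selector greedily adds compressed chunks and stops early once it is confident enough, so easy queries consume fewer context tokens.

```python
# Hypothetical sketch of an adaptive context-selection loop in the spirit
# of ACC-RAG. The real system uses a learned hierarchical compressor and
# selector; here a word-subsampling stub and relevance scores stand in.

def compress(chunk: str, ratio: int) -> str:
    """Stand-in compressor: keep every `ratio`-th word of the chunk."""
    return " ".join(chunk.split()[::ratio])

def select_context(chunks, relevance, budget_tokens, threshold):
    """Greedily add the most relevant chunks (compressed) until the
    cumulative relevance crosses `threshold` or the budget runs out."""
    order = sorted(range(len(chunks)), key=lambda i: -relevance[i])
    picked, score, used = [], 0.0, 0
    for i in order:
        # Easy queries cross the threshold early -> fewer tokens kept.
        if score >= threshold:
            break
        ratio = 1 if relevance[i] > 0.8 else 4  # compress weak chunks harder
        c = compress(chunks[i], ratio)
        cost = len(c.split())
        if used + cost > budget_tokens:
            continue  # skip chunks that would exceed the token budget
        picked.append(c)
        used += cost
        score += relevance[i]
    return picked, used

chunks = ["alpha beta gamma delta", "one two three four", "x y z w"]
relevance = [0.9, 0.5, 0.2]  # assumed retriever scores per chunk
picked, used = select_context(chunks, relevance, budget_tokens=10, threshold=1.0)
```

With these inputs the high-relevance chunk is kept verbatim, the next chunk is heavily compressed, and the loop stops before touching the third, illustrating how the retained context shrinks when confidence is reached early.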