In this paper, we propose Danceba, a novel framework for automatically generating natural, diverse, and rhythmic human dance movements from music. To address the limitations of existing methods, such as poor beat alignment and unnatural movement dynamics, Danceba improves rhythm-aware feature representation through a gating mechanism. Specifically, we propose Phase-Based Rhythm Extraction (PRE), which accurately extracts rhythm information from the phase data of music, and Temporal-Gated Causal Attention (TGCA), which focuses on global rhythm features to ensure that generated dance movements closely follow the musical rhythm. In addition, we enhance the naturalness and diversity of the generated dances through a Parallel Mamba Motion Modeling (PMMM) architecture, which models upper-body and lower-body motions separately alongside musical features. Experimental results show that Danceba significantly outperforms existing state-of-the-art methods in both rhythm alignment and movement diversity.
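The core idea behind TGCA, combining a causal attention mask with a temporal gate that modulates the attended features, can be sketched minimally as follows. The abstract does not specify the exact gating formulation, so the single-head layout, the sigmoid gate, and the weight matrices below are illustrative assumptions rather than the paper's implementation:

```python
import numpy as np


def softmax(x, axis=-1):
    """Numerically stable softmax."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)


def temporal_gated_causal_attention(x, Wq, Wk, Wv, Wg):
    """Single-head causal self-attention with a per-timestep sigmoid gate.

    x: (T, d) sequence of features; Wq, Wk, Wv, Wg: (d, d) weights
    (hypothetical parameterization for illustration).
    """
    T, d = x.shape
    q, k, v = x @ Wq, x @ Wk, x @ Wv

    # Scaled dot-product scores with a causal mask: position t may only
    # attend to positions <= t, so motion cannot depend on future beats.
    scores = (q @ k.T) / np.sqrt(d)
    scores[np.triu(np.ones((T, T), dtype=bool), k=1)] = -np.inf
    attn = softmax(scores, axis=-1)

    # Temporal gate: a sigmoid of a linear projection of the current
    # frame, modulating how much attended rhythm context passes through.
    gate = 1.0 / (1.0 + np.exp(-(x @ Wg)))
    return gate * (attn @ v)
```

Because both the mask and the gate depend only on past and current frames, the output at any timestep is unaffected by later inputs, which is what makes the mechanism usable for autoregressive motion generation.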