In this paper, we present Garment Inertial Poser (GaIP), a novel method for estimating full-body pose from a sparse set of inertial measurement units (IMUs) loosely attached to clothing. Existing IMU-based motion capture methods assume that the IMUs are rigidly attached to the body, an assumption that frequently breaks down in real-world deployments. GaIP synthesizes loose-attachment IMU measurements from existing datasets of clothed human motion and estimates full-body pose from these signals with a transformer-based diffusion model. By incorporating clothing-related parameters into the learning process, the method captures variations in garment looseness and tightness without sacrificing the expressiveness of the estimated poses. Experiments demonstrate that GaIP outperforms existing state-of-the-art methods both quantitatively and qualitatively, opening new possibilities for motion capture in realistic sensor deployments.
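To make the conditioning idea concrete, the sketch below shows one plausible way a transformer-based diffusion denoiser could take noisy pose sequences, IMU features, and clothing-related parameters as input. This is a minimal illustration under assumed dimensions and an assumed additive conditioning scheme; the class name, feature sizes, and module layout are hypothetical and are not the paper's actual architecture.

```python
import torch
import torch.nn as nn

class GarmentConditionedDenoiser(nn.Module):
    """Illustrative transformer denoiser for a pose diffusion model,
    conditioned on sparse IMU readings and per-sequence garment
    parameters (e.g., looseness). All dimensions are assumptions."""

    def __init__(self, pose_dim=132, imu_dim=72, garment_dim=4,
                 d_model=256, n_heads=8, n_layers=6):
        super().__init__()
        self.pose_in = nn.Linear(pose_dim, d_model)
        self.imu_in = nn.Linear(imu_dim, d_model)
        # Garment parameters and the diffusion timestep are embedded
        # and injected as global (per-sequence) conditioning signals.
        self.garment_emb = nn.Linear(garment_dim, d_model)
        self.time_emb = nn.Sequential(
            nn.Linear(1, d_model), nn.SiLU(), nn.Linear(d_model, d_model))
        layer = nn.TransformerEncoderLayer(
            d_model, n_heads, dim_feedforward=4 * d_model, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.pose_out = nn.Linear(d_model, pose_dim)

    def forward(self, noisy_pose, imu, garment, t):
        # noisy_pose: (B, T, pose_dim), imu: (B, T, imu_dim),
        # garment: (B, garment_dim), t: (B,) diffusion step in [0, 1].
        cond = self.garment_emb(garment) + self.time_emb(t[:, None].float())
        x = self.pose_in(noisy_pose) + self.imu_in(imu) + cond[:, None, :]
        # Output can be interpreted as predicted noise or the clean pose,
        # depending on the diffusion parameterization chosen.
        return self.pose_out(self.encoder(x))

# Hypothetical usage with random tensors, batch of 2, 60-frame windows:
model = GarmentConditionedDenoiser()
out = model(torch.randn(2, 60, 132), torch.randn(2, 60, 72),
            torch.rand(2, 4), torch.rand(2))
```

One motivation for this kind of additive conditioning is that a single network can then share weights across garment fits, with the garment embedding steering the denoiser rather than requiring a separate model per clothing type.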