This is a page that curates AI-related papers published worldwide. All content here is summarized using Google Gemini and operated on a non-profit basis. Copyright for each paper belongs to the authors and their institutions; please make sure to credit the source when sharing.
ETCH: Generalizing Body Fitting to Clothed Humans via Equivariant Tightness
Created by
Haebom
Author
Boqian Li, Haiwen Feng, Zeyu Cai, Michael J. Black, Yuliang Xiu
Outline
This paper addresses the problem of fitting a body model to the 3D point cloud of a clothed human. Existing optimization-based methods use multi-stage pipelines that are sensitive to pose initialization, while recent learning-based methods struggle to generalize across diverse poses and clothing types. The authors propose Equivariant Tightness Fitting for Clothed Humans (ETCH), a novel pipeline that estimates the clothing-to-body surface mapping via locally approximated SE(3) equivariance, encoding tightness as a displacement vector from the clothing surface to the underlying body. Following this mapping, pose-invariant body features are regressed onto sparse body markers, reducing the clothed-human fitting task to an inner-body marker fitting task. Extensive experiments on CAPE and 4D-Dress show that ETCH significantly outperforms state-of-the-art methods (both tightness-agnostic and tightness-aware) in body fitting accuracy (16.7% to 69.5%) and shape accuracy (average 49.9%) on loose clothing. The equivariant tightness design also reduces directional errors by 67.2% to 89.8% in one-shot (or out-of-distribution) settings (~1% of the data). Qualitative results demonstrate ETCH's robust generalization across challenging poses, unseen shapes, loose clothing, and non-rigid dynamics. The code and models will be publicly available at https://boqian-li.github.io/ETCH/.
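The core idea of the tightness vector can be sketched in a few lines: each clothing point is pushed toward the underlying body along a predicted direction by a predicted magnitude. The following is a minimal illustrative sketch (not the authors' implementation; function names, shapes, and the toy sphere data are assumptions for illustration):

```python
import numpy as np

def map_cloth_to_body(cloth_points, directions, magnitudes):
    """Illustrative tightness mapping (hypothetical, not ETCH's actual code).

    cloth_points: (N, 3) clothing point cloud
    directions:   (N, 3) vectors pointing from the cloth toward the body
    magnitudes:   (N,)   per-point tightness (cloth-to-body distance)
    Returns the estimated inner-body points.
    """
    # Normalize directions, then displace each point by its tightness magnitude.
    dirs = directions / np.linalg.norm(directions, axis=1, keepdims=True)
    return cloth_points + dirs * magnitudes[:, None]

# Toy usage: points on a sphere of radius 1.2 ("clothing") are mapped
# inward by a uniform tightness of 0.2, landing on the unit sphere ("body").
rng = np.random.default_rng(0)
pts = rng.normal(size=(100, 3))
pts = 1.2 * pts / np.linalg.norm(pts, axis=1, keepdims=True)
inward = -pts  # points toward the center; normalized inside the function
body = map_cloth_to_body(pts, inward, np.full(100, 0.2))
radii = np.linalg.norm(body, axis=1)  # all ~1.0
```

In the actual pipeline, the directions and magnitudes are predicted by the equivariant network rather than given, and sparse body markers are then regressed from the mapped inner points for the final fit.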
Takeaways, Limitations
•
Takeaways:
◦
Proposes ETCH, a new body-fitting pipeline for clothed humans based on SE(3) equivariance.
◦
Significantly improves body-fitting accuracy and shape accuracy on loose clothing compared to existing methods.
◦
Strong generalization across diverse poses, clothing types, and non-rigid body dynamics.
◦
Significantly reduces directional error in one-shot settings.
◦
Code and model to be released soon.
•
Limitations:
◦
The code and model are not yet public.
◦
Further validation on more diverse datasets is needed.
◦
Performance may degrade for specific clothing types or body shapes.