Pattern-Based Cloth Registration and Sparse-View Animation

O Halimi, T Stuyck, D Xiang, TM Bagautdinov… - ACM Trans. Graph., 2022
We introduce a novel approach for capturing full garments in motion with high accuracy. Our cloth registration pipeline operates in a multi-view camera setting and produces a dynamic sequence of registered cloth meshes with a registration resolution of 2.67 millimeters and a triangulation localization error of 1 millimeter, greatly improving upon previous work. Most notably, compared to state-of-the-art cloth registration methods [Pons-Moll et al. 2017; Xiang et al. 2021], our approach produces temporally stable texture coordinates, allowing a specific surface point to be tracked with negligible drift in the point's identity between frames. This accurate registration is achieved using a novel pattern printed on the cloth, designed to optimize the ratio of pixel area to the number of uniquely registered cloth surface points. Dense pattern primitives are localized using a specialized image-based detector, and each pattern keypoint is identified using a graph processing algorithm that robustly handles the combination of non-rigid and projective transformations, self-occlusions, and detection noise. Note that our method does not assume any human body model and treats the cloth as a generic non-rigid surface. In contrast, ClothCap [Pons-Moll et al. 2017] is built on SMPL [Loper et al. 2015a], and [Xiang et al. 2021] uses a proprietary LBS-based body model, which generally leads to decreased tracking performance when the cloth deviates significantly from the body.
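The abstract names only the two stages of the keypoint pipeline, so the following is a minimal sketch of that idea rather than the authors' implementation: localize dense pattern primitives with an off-the-shelf blob detector, connect detections into a k-nearest-neighbor graph (local neighborhoods remain meaningful under non-rigid and projective distortion), and identify each keypoint by matching a neighborhood descriptor against a codebook of unique surface-point IDs. The descriptor function, the codebook, and all parameters below are illustrative assumptions, not taken from the paper.

```python
# Hedged sketch of a detect-then-identify pattern pipeline (NOT the paper's code).
import cv2
import numpy as np
from scipy.spatial import cKDTree

def detect_primitives(gray: np.ndarray) -> np.ndarray:
    """Localize dark circular pattern primitives with a simple blob detector."""
    params = cv2.SimpleBlobDetector_Params()
    params.filterByArea = True
    params.minArea = 10.0  # assumed: tune to the printed primitive size in pixels
    detector = cv2.SimpleBlobDetector_create(params)
    keypoints = detector.detect(gray)
    return np.array([kp.pt for kp in keypoints], dtype=np.float64)

def build_neighborhood_graph(points: np.ndarray, k: int = 8) -> np.ndarray:
    """Connect each detection to its k nearest neighbors. Purely local
    neighborhoods tolerate the non-rigid and projective distortions
    mentioned in the abstract better than any global grid assumption."""
    tree = cKDTree(points)
    _, idx = tree.query(points, k=k + 1)  # first neighbor is the point itself
    return idx[:, 1:]

def identify_keypoints(points, neighbors, descriptor_fn, codebook):
    """Assign each detection the ID of its best-matching code word.
    `descriptor_fn` summarizes a local neighborhood as a vector and
    `codebook` maps surface-point IDs to code vectors; both are
    hypothetical stand-ins for the paper's graph processing algorithm."""
    ids = []
    for i, nbrs in enumerate(neighbors):
        desc = descriptor_fn(points[i], points[nbrs])
        # Nearest code word; a robust method would additionally verify that
        # neighboring assignments are mutually consistent to reject
        # detection noise and occlusion-induced outliers.
        best = min(codebook, key=lambda c: np.linalg.norm(desc - codebook[c]))
        ids.append(best)
    return ids
```

The key design point this sketch illustrates is that identification is driven by graph structure around each primitive rather than by appearance alone, which is what makes per-point identity stable across frames.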
Our motivation for developing an accurate cloth registration method is that many cloth-related computer vision tasks targeting realistic cloth appearance could benefit from precisely registered mesh sequences. One major challenge in representing clothing comes from the lack of high-quality cloth registration of the stretching and shearing of fabric on moving bodies, which is notoriously difficult due to the numerous self-occlusions and the lack of visual features for alignment. Previous work has made efforts to capture simple cloth swatches under external forces in a controlled environment [Bhat et al. 2003; Clyde et al. 2017; Miguel et al. 2012; Rasheed et al. 2020; Wang et al. 2011]. However, these captures only consider isolated suspended fabrics and do not capture the combined effects of friction, air drag, external forces, or the interaction of worn garments with the underlying body. While synthetic thin-shell simulation has made drastic progress in recent decades [Stuyck 2018], a simulation-to-real gap remains. Our high-quality capture of the fabric's dynamics could help bridge this gap. Applications that could benefit from the proposed registration method include enhancing physical cloth simulation [Jin et al. 2020; Runia et al. 2020] and facilitating neural models for cloth dynamics [Holden et al. 2019; Lahner et al. 2018; Liang et al. 2019; Patel …