Publication Details
Overview
 
 
Pengpeng Hu, Nastaran Nourbakhsh, Jing Tian, Stephan Sturges, Vasile Dadarlat, Adrian Munteanu
 

Contribution to journal

Abstract 

Virtual try-on synthesizes garments on target bodies in 2D/3D domains. Although existing virtual try-on methods focus on redressing garments, virtual try-on of hair, shoes, and wearable accessories remains under-explored. In this paper, we present the first general virtual try-on method that is fully automatic and suitable for many items, including garments, hair, shoes, watches, necklaces, hats, and so on. Starting with wearable items pre-defined on a reference human body model, an automatic method is proposed to deform the reference body mesh to fit a target body, yielding dense triangle correspondences. An improved fit metric is then used to represent the interaction between the wearable items and the body. Finally, using the triangle correspondences and the fit metric, the wearable items can be quickly and efficiently inferred from the shape and posture of the target body. Extensive experimental results show that, besides being automatic and efficient, the proposed method can easily be extended to dynamic try-on by applying rigging and importing motion-capture data, and can handle both tight and loose garments, and even multi-layer clothing.
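The transfer step described in the abstract, re-expressing wearable-item geometry relative to corresponding triangles on the deformed body, can be illustrated with a minimal sketch. This is a hypothetical example, not the authors' code: each item vertex is encoded as barycentric coordinates plus a signed normal offset with respect to its associated reference-body triangle, then decoded on the corresponding target-body triangle. All function names are illustrative.

```python
import numpy as np

def encode(p, tri):
    """Barycentric coords + signed normal offset of point p w.r.t. triangle tri (3x3)."""
    a, b, c = tri
    n = np.cross(b - a, c - a)
    n_hat = n / np.linalg.norm(n)
    d = np.dot(p - a, n_hat)            # offset along the triangle normal
    q = p - d * n_hat                   # projection onto the triangle plane
    # Solve q = a + u*(b - a) + v*(c - a) for (u, v) in a least-squares sense
    m = np.stack([b - a, c - a], axis=1)
    u, v = np.linalg.lstsq(m, q - a, rcond=None)[0]
    return np.array([1 - u - v, u, v]), d

def decode(bary, d, tri):
    """Re-evaluate the encoded point on a (corresponding) target triangle."""
    a, b, c = tri
    n = np.cross(b - a, c - a)
    n_hat = n / np.linalg.norm(n)
    return bary[0] * a + bary[1] * b + bary[2] * c + d * n_hat

# Round-trip on the same triangle recovers the original point; with a target
# triangle from the deformed body, the item vertex follows the body's shape.
ref_tri = np.array([[0., 0., 0.], [1., 0., 0.], [0., 1., 0.]])
p = np.array([0.2, 0.3, 0.1])
bary, d = encode(p, ref_tri)
p_back = decode(bary, d, ref_tri)
```

Because the encoding is local to each triangle, dense correspondences let the whole item deform consistently with the target body's shape and posture.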
