Please use this identifier to cite or link to this item:
Title: Variational mesh decomposition
Authors: Zhang, Juyong
Keywords: DRNTU::Engineering::Computer science and engineering
Issue Date: 2012
Source: Zhang, J., Zheng, J., Wu, C., & Cai, J. (2012). Variational mesh decomposition. ACM Transactions on Graphics, 31(3).
Series/Report no.: ACM Transactions on Graphics
Abstract: The problem of decomposing a 3D mesh into meaningful segments (or parts) is of great practical importance in computer graphics. This article presents a variational mesh decomposition algorithm that can efficiently partition a mesh into a prescribed number of segments. The algorithm extends the Mumford-Shah model to 3D meshes. It contains a data term, which measures the variation within a segment using eigenvectors of a dual Laplacian matrix whose weights are related to the dihedral angles between adjacent triangles, and a regularization term, which measures the length of the boundary between segments. Such a formulation handles segmentation and boundary smoothing simultaneously, whereas most previous work treats them as two separate processes. Efficiency is achieved by recasting the Mumford-Shah model as a saddle-point problem, which is solved by a fast primal-dual method. A preprocessing step is also proposed to determine the number of segments into which the mesh should be decomposed; by incorporating this step, the algorithm can automatically segment a mesh into meaningful parts. Furthermore, user interaction is allowed by incorporating the user's inputs into the variational model to reflect the user's specific intent. Experimental results show that the proposed algorithm outperforms competing segmentation methods when evaluated on the Princeton Segmentation Benchmark.
URI: https://hdl.handle.net/10356/98038
ISSN: 0730-0301
DOI: 10.1145/2167076.2167079
Rights: © 2012 ACM.
Fulltext Permission: none
Fulltext Availability: No Fulltext
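The abstract's data term rests on a dual Laplacian whose edge weights depend on the dihedral angle between adjacent triangles. As a minimal illustrative sketch of that idea (the exponential form and the `eta` parameter are assumptions for illustration, not the paper's exact weighting, and the convex/concave distinction is omitted), such a weight can be computed from the two face normals:

```python
import numpy as np

def face_normal(v0, v1, v2):
    """Unit normal of a triangle given its three vertex positions."""
    n = np.cross(v1 - v0, v2 - v0)
    return n / np.linalg.norm(n)

def dihedral_weight(n1, n2, eta=0.2):
    """Dual-graph edge weight between two adjacent triangles.

    Illustrative choice (not the paper's exact formula): the weight is
    near 1 when the faces are nearly coplanar and decays toward 0 as
    the dihedral angle sharpens, so partitioning the dual graph tends
    to place segment boundaries along creases.
    """
    cos_a = np.clip(np.dot(n1, n2), -1.0, 1.0)
    return np.exp(-(1.0 - cos_a) / eta)

# Two coplanar triangles sharing edge (1,0,0)-(0,1,0): weight is 1.
v = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [1, 1, 0]], dtype=float)
flat = dihedral_weight(face_normal(v[0], v[1], v[2]),
                       face_normal(v[1], v[3], v[2]))

# Fold the second triangle upward: the weight drops below 1.
v[3] = [1, 1, 1]
folded = dihedral_weight(face_normal(v[0], v[1], v[2]),
                         face_normal(v[1], v[3], v[2]))
```

Assembling these weights over all pairs of adjacent faces yields the weighted dual graph whose Laplacian eigenvectors drive the data term described above.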
Appears in Collections: SCSE Journal Articles
Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.