An Improved Skeleton-Based Technique For Three-Dimensional Model Segmentation

Three-dimensional (3D) model segmentation has received tremendous attention in recent years; its aim is to partition a model's mesh into meaningful sub-meshes. Hitherto, there is no robust and consistent segmentation method that overcomes the problems of under- and over-segmentation of meaningful components. Many existing methods either require user-input seeds for the number of segments or apply minima rules to approximate the meaningful components. Some methods excel only on a narrow range of models; they are vague, sensitive to model shape (unstable), and tedious (duplicated processes). Slinky-based segmentation (SBS) with an improved skeleton method is proposed in this thesis to automatically and consistently identify the meaningful features of a model. The method is robust for any input model shape. The algorithm begins with voxelization and surface reconstruction of the input model to remove irregular meshes. A Laplacian-based contraction method is then adapted to shrink the model into a triangular skeleton.
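The abstract outlines a pipeline of voxelization, surface reconstruction, and Laplacian-based contraction prior to the slinky-based segmentation itself. The sketch below is only a rough illustration of the contraction idea, not the thesis's implementation: it uses a uniform graph Laplacian and repeatedly solves the stacked system [w_l·L; w_h·I]·V' = [0; w_h·V] in a least-squares sense. The function names, weights, and doubling schedule are illustrative assumptions; published contraction schemes typically use cotangent weights and per-vertex attraction weights instead.

```python
import numpy as np

def uniform_laplacian(n_vertices, edges):
    """Combinatorial graph Laplacian L = D - A (a simplification of the
    cotangent-weighted Laplacian usually used for mesh contraction)."""
    L = np.zeros((n_vertices, n_vertices))
    for i, j in edges:
        L[i, j] -= 1.0
        L[j, i] -= 1.0
        L[i, i] += 1.0
        L[j, j] += 1.0
    return L

def contract(vertices, edges, w_l=1.0, w_h=1.0, iterations=5):
    """Sketch of Laplacian-based contraction: each pass solves
    [w_l * L; w_h * I] V' = [0; w_h * V] in a least-squares sense.
    The Laplacian term pulls the shape toward a skeletal configuration,
    while the identity term keeps vertices near their previous positions."""
    V = np.asarray(vertices, dtype=float)
    n = V.shape[0]
    L = uniform_laplacian(n, edges)
    I = np.eye(n)
    for _ in range(iterations):
        A = np.vstack((w_l * L, w_h * I))
        b = np.vstack((np.zeros_like(V), w_h * V))
        V = np.linalg.lstsq(A, b, rcond=None)[0]
        w_l *= 2.0  # strengthen the contraction term each pass (assumed schedule)
    return V

# Tiny usage example: a bent chain of 3D points contracts toward a line-like skeleton.
verts = np.array([[0, 0, 0], [1, 0.3, 0], [2, 0.5, 0], [3, 0.3, 0], [4, 0, 0]], float)
edges = [(0, 1), (1, 2), (2, 3), (3, 4)]
print(contract(verts, edges))
```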


Bibliographic Details
Main Author: Ng, Kok Why
Format: Thesis
Published: 2016
Subjects:
id my-mmu-ep.7147
record_format uketd_dc
spelling my-mmu-ep.71472018-03-16T16:47:13Z An Improved Skeleton-Based Technique For Three-Dimensional Model Segmentation 2016-10 Ng, Kok Why TA1501-1820 Applied optics. Photonics Three-dimensional (3D) model segmentation has received tremendous attention in recent years; its aim is to partition a model's mesh into meaningful sub-meshes. Hitherto, there is no robust and consistent segmentation method that overcomes the problems of under- and over-segmentation of meaningful components. Many existing methods either require user-input seeds for the number of segments or apply minima rules to approximate the meaningful components. Some methods excel only on a narrow range of models; they are vague, sensitive to model shape (unstable), and tedious (duplicated processes). Slinky-based segmentation (SBS) with an improved skeleton method is proposed in this thesis to automatically and consistently identify the meaningful features of a model. The method is robust for any input model shape. The algorithm begins with voxelization and surface reconstruction of the input model to remove irregular meshes. A Laplacian-based contraction method is then adapted to shrink the model into a triangular skeleton. 2016-10 Thesis http://shdl.mmu.edu.my/7147/ http://library.mmu.edu.my/diglib/onlinedb/dig_lib.php phd doctoral Multimedia University Faculty of Computing and Informatics
institution Multimedia University
collection MMU Institutional Repository
topic TA1501-1820 Applied optics
Photonics
spellingShingle TA1501-1820 Applied optics
Photonics
Ng, Kok Why
An Improved Skeleton-Based Technique For Three-Dimensional Model Segmentation
description Three-dimensional (3D) model segmentation has received tremendous attention in recent years; its aim is to partition a model's mesh into meaningful sub-meshes. Hitherto, there is no robust and consistent segmentation method that overcomes the problems of under- and over-segmentation of meaningful components. Many existing methods either require user-input seeds for the number of segments or apply minima rules to approximate the meaningful components. Some methods excel only on a narrow range of models; they are vague, sensitive to model shape (unstable), and tedious (duplicated processes). Slinky-based segmentation (SBS) with an improved skeleton method is proposed in this thesis to automatically and consistently identify the meaningful features of a model. The method is robust for any input model shape. The algorithm begins with voxelization and surface reconstruction of the input model to remove irregular meshes. A Laplacian-based contraction method is then adapted to shrink the model into a triangular skeleton.
format Thesis
qualification_name Doctor of Philosophy (PhD.)
qualification_level Doctorate
author Ng, Kok Why
author_facet Ng, Kok Why
author_sort Ng, Kok Why
title An Improved Skeleton-Based Technique For Three-Dimensional Model Segmentation
title_short An Improved Skeleton-Based Technique For Three-Dimensional Model Segmentation
title_full An Improved Skeleton-Based Technique For Three-Dimensional Model Segmentation
title_fullStr An Improved Skeleton-Based Technique For Three-Dimensional Model Segmentation
title_full_unstemmed An Improved Skeleton-Based Technique For Three-Dimensional Model Segmentation
title_sort improved skeleton-based technique for three-dimensional model segmentation
granting_institution Multimedia University
granting_department Faculty of Computing and Informatics
publishDate 2016
_version_ 1747829653344616448