Hierarchical Image Segmentation Based on Nonsymmetry and Anti-Packing Pattern Representation Model

Abstract:

Image segmentation is the foundation of high-level image analysis and image understanding. How to effectively segment an image into regions that are “meaningful” to human visual perception, while ensuring that the segmented regions remain consistent across different resolutions, is still a very challenging problem. Inspired by the idea of the Nonsymmetry and Anti-Packing pattern representation Model in the Lab color space (NAMLab) and the “global-first” invariant perceptual theory, in this paper we propose a novel framework for hierarchical image segmentation. First, by defining the dissimilarity between two pixels in the Lab color space, we propose an NAMLab-based color image representation approach that is more consistent with the characteristics of human visual perception and allows image pixels to be merged quickly and effectively into NAMLab blocks. Then, by defining the dissimilarity between two NAMLab-based regions and iteratively merging adjacent regions into larger ones to progressively generate a segmentation dendrogram, we propose a fast NAMLab-based algorithm for hierarchical image segmentation. Finally, the complexities of the proposed algorithm are analyzed in detail. The experimental results presented in this paper show that, compared with state-of-the-art algorithms, our algorithm not only preserves more details of object boundaries but also better identifies foreground objects with similar color distributions. Moreover, it runs much faster and requires less memory, and is therefore a better choice for hierarchical image segmentation.
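
To make the pipeline outlined above concrete, the following minimal Python sketch illustrates the two ingredients the abstract refers to: a pixel-level dissimilarity in the Lab color space and a greedy agglomerative merging of adjacent regions that produces a segmentation dendrogram. This is only an illustrative sketch under simplifying assumptions (Euclidean Lab distance as the dissimilarity, mean Lab color as the region descriptor), not the authors' NAMLab algorithm; all names (lab_dissimilarity, build_dendrogram, mean_lab) are hypothetical.

```python
# A minimal sketch, assuming Euclidean Lab distance and mean-color region
# descriptors; it is NOT the authors' NAMLab implementation.
import numpy as np


def lab_dissimilarity(p, q):
    """Euclidean distance between two pixels given as (L, a, b) triples."""
    return float(np.linalg.norm(np.asarray(p, dtype=float) - np.asarray(q, dtype=float)))


def build_dendrogram(mean_lab, edges):
    """Greedily merge adjacent regions into a segmentation dendrogram.

    mean_lab : dict mapping region id (int) -> mean Lab color, shape (3,)
    edges    : iterable of frozenset({i, j}) giving region adjacency

    Returns a list of (i, j, dissimilarity) merge records, ordered from the
    most similar adjacent pair to the least similar; truncating the list at
    any point yields one level of the segmentation hierarchy.
    """
    mean_lab = {k: np.asarray(v, dtype=float) for k, v in mean_lab.items()}
    sizes = {k: 1 for k in mean_lab}
    edges = set(edges)
    next_id = max(mean_lab) + 1
    dendrogram = []

    while edges:
        # Pick the most similar pair of adjacent regions.
        pair = min(edges, key=lambda e: lab_dissimilarity(*(mean_lab[k] for k in e)))
        i, j = sorted(pair)
        dendrogram.append((i, j, lab_dissimilarity(mean_lab[i], mean_lab[j])))

        # The merged region's mean color is the size-weighted average.
        total = sizes[i] + sizes[j]
        mean_lab[next_id] = (mean_lab[i] * sizes[i] + mean_lab[j] * sizes[j]) / total
        sizes[next_id] = total

        # Rewire adjacency: neighbors of i or j become neighbors of the new region.
        rewired = set()
        for edge in edges:
            rest = edge - pair
            if len(rest) == 2:                    # edge untouched by the merge
                rewired.add(edge)
            elif len(rest) == 1:                  # edge touched i or j
                rewired.add(frozenset(rest | {next_id}))
            # len(rest) == 0 is the merged edge itself; drop it
        edges = rewired

        for k in (i, j):
            del mean_lab[k], sizes[k]
        next_id += 1

    return dendrogram


# Toy usage: four regions with 4-connected adjacency; the two reddish and the
# two greenish regions are merged first, then the two resulting clusters.
colors = {0: (50, 0, 0), 1: (52, 1, 0), 2: (20, 30, -10), 3: (21, 29, -9)}
adjacency = {frozenset(e) for e in [(0, 1), (0, 2), (1, 3), (2, 3)]}
print(build_dendrogram(colors, adjacency))
```

Reading the returned merge list up to increasing dissimilarity thresholds yields segmentations from fine to coarse, which is the hierarchical behavior that the paper's segmentation dendrogram encodes.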