Query by Pictorial Example

Abstract: 
The appearance of real scenes depends strongly on acquisition conditions such as illumination and viewpoint, which significantly complicates automatic analysis of images of such scenes. In this thesis, we introduce novel textural features suitable for robust recognition of natural and artificial materials (textures) present in real scenes. The features are based on efficient modelling of spatial relations by a type of Markov Random Field (MRF) model, and we prove that they are invariant to illumination colour, cast shadows, and texture rotation. Moreover, the features are robust to changes of illumination direction and to degradation by Gaussian noise, and they are related to human perception of textures.
The features were favourably tested on current textural databases (Outex, Bonn BTF, CUReT, ALOT, and KTH-TIPS2), where they outperformed state-of-the-art methods (including opponent Gabor features, LBP, LBP-HF, and MR8-LINC) in almost all experiments; e.g. the result on the ALOT dataset was improved by 20%. We applied the features in the construction of a content-based tile retrieval system, the optimisation of texture compression parameters in accordance with human perception, invariant segmentation of multimodal textures, and the recognition of glaucomatous tissue in retina images.

The presented methods can improve existing CBIR systems, or they can be utilised in domain-specific CBIR systems focused on structural/textural similarity. Other possible applications lie in computer vision, since the analysis of real scenes often requires recognition of textures under variable conditions.
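To make the feature-extraction idea concrete, the following is a minimal sketch of one common MRF-style approach: a least-squares fit of a simple causal autoregressive model, whose coefficients and residual variance serve as texture features. This is a deliberately simplified illustration of the model family, not the invariant features developed in the thesis; the neighbourhood choice and function name are assumptions.

```python
import numpy as np

def car_features(img, neighbours=((0, -1), (-1, 0), (-1, -1), (-1, 1))):
    """Fit a causal autoregressive (CAR) texture model by least squares.

    Each pixel is predicted from a small causal neighbourhood; the fitted
    coefficients plus the residual variance form the feature vector.
    Simplified illustration only, not the thesis features.
    """
    img = img.astype(float)
    H, W = img.shape
    # Target pixels (interior, so every neighbour offset stays in bounds).
    y = img[1:H - 1, 1:W - 1].ravel()
    # Design matrix: one column per causal neighbour, shifted accordingly.
    X = np.stack([img[1 + dy:H - 1 + dy, 1 + dx:W - 1 + dx].ravel()
                  for dy, dx in neighbours], axis=1)
    theta, *_ = np.linalg.lstsq(X, y, rcond=None)
    sigma2 = np.mean((y - X @ theta) ** 2)  # residual variance
    return np.concatenate([theta, [sigma2]])
```

Two textures can then be compared by a distance between their feature vectors; the thesis features additionally achieve the invariance properties described above.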


Online demonstrations:
The input image is selected simply by clicking on it, and the system finds the most similar images. Images are considered similar if their structure is similar, regardless of variations in acquisition conditions (e.g. illumination colour, specified on the right). A comparison with alternative state-of-the-art methods can be explored by clicking on the "settings" button. The first demonstration explores classification performance (showing the nearest training images), while the others focus on retrieval of similar images from the database.
Rotation and illumination invariance
(ALOT dataset)
 
Illumination spectrum/colour invariance
(Outex dataset)
 
Illumination direction robustness
(CUReT dataset)
 
Content-Based Tile Retrieval System
(Sanita.cz dataset)
 
Reference: