Institute of Information Theory and Automation

Query by Pictorial Example

Pavel Vácha
MFF UK, Malostranské nám., Malá aula
The ongoing expansion of digital imagery requires new methods for sorting, browsing, and searching through huge image databases. This is the domain of Content-Based Image Retrieval (CBIR) systems, which are database search engines for images. A user typically submits a query image or a series of images, and the CBIR system tries to find and retrieve the most similar images from the database. Optimally, the retrieved images should not be sensitive to the circumstances of their acquisition. Unfortunately, the appearance of natural objects and materials depends strongly on illumination and viewpoint. This work focuses on the representation and retrieval of homogeneous images, called textures, under variable illumination and texture rotation. We propose novel illumination-invariant textural features based on Markovian modelling of spatial texture relations. The texture is modelled by a Causal Autoregressive Random field (CAR) or a Gaussian Markov Random Field (GMRF) model, both of which allow very efficient estimation of their parameters, without demanding Monte Carlo minimisation. The estimated model parameters are subsequently transformed into the new illumination invariants, which represent the texture. We derived that our textural representation is invariant to changes of illumination intensity and colour/spectrum, and also approximately invariant to local intensity variation (e.g. cast shadows). On top of that, our experiments showed that the proposed features are robust to variations of illumination direction and to image degradation by additive Gaussian noise. The textural representation is further extended to be simultaneously illumination and rotation invariant.
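The core idea above can be illustrated with a minimal sketch: a 2D causal autoregressive model predicts each pixel from a few causal neighbours, and its coefficients are estimated by plain least squares (no Monte Carlo optimisation). Because scaling the image scales both sides of the regression, the coefficients are exactly invariant to a global change of illumination intensity. The neighbour set and function name below are illustrative assumptions, not the thesis's exact contextual neighbourhood or invariant set.

```python
import numpy as np

def car_coefficients(img, neighbours=((0, -1), (-1, 0), (-1, -1), (-1, 1))):
    """Estimate 2D causal autoregressive (CAR) coefficients by least squares.

    Each pixel is regressed on its causal neighbours (pixels already
    visited in raster-scan order).  This is an illustrative sketch of
    CAR parameter estimation, not the thesis's exact formulation.
    """
    img = np.asarray(img, dtype=float)
    h, w = img.shape
    ys, xs = [], []
    # Collect one regression equation per valid interior pixel site.
    for i in range(1, h):
        for j in range(1, w - 1):
            ys.append(img[i, j])
            xs.append([img[i + di, j + dj] for di, dj in neighbours])
    X = np.array(xs)
    y = np.array(ys)
    # Closed-form least-squares fit: no iterative sampling needed.
    gamma, *_ = np.linalg.lstsq(X, y, rcond=None)
    return gamma

# Intensity invariance: multiplying the image by a constant rescales
# both the target pixel and its neighbours, so gamma is unchanged.
rng = np.random.default_rng(0)
texture = rng.random((32, 32))
g_dark = car_coefficients(texture)
g_bright = car_coefficients(3.7 * texture)  # brighter "illumination"
assert np.allclose(g_dark, g_bright)
```

The same scale-cancellation argument underlies the intensity part of the invariants; the thesis derives further invariants covering colour/spectrum changes, which this sketch does not attempt to reproduce.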
The proposed features were tested in experiments on five different texture databases (Outex, Bonn BTF, CUReT, ALOT, and KTH-TIPS2). The experiments, closely resembling real-life conditions, confirmed that the proposed features are able to recognise materials under variable illumination conditions and different viewpoint directions. The proposed representation outperformed other state-of-the-art textural representations (among others, opponent Gabor features, LBP, LBP-HF, and MR8-LINC) in almost all experiments. Our methods do not require any knowledge of the acquisition conditions, and recognition is possible even with a single training image per material, provided that substantial scale variation or perspective projection is not involved. Psychophysical experiments also indicated that our methods for the evaluation of textural similarity are related to the human perception of textures.
Four applications of our invariant features are presented. We developed a CBIR system that retrieves similar tiles, and we integrated the invariants into a texture segmentation algorithm. Further feasible applications were demonstrated in the optimisation of texture compression parameters and in the recognition of glaucomatous tissue in retina images. We expect that the presented methods can improve the performance of existing CBIR systems, or that they can be utilised in specialised CBIR systems focused on, e.g., textural medical images or tiles, as in the presented system. Other applications include computer vision in general, since the analysis of real scenes often requires a description of textures under various light conditions.