This paper describes a simple novel compound random field model capable of realistically modelling the bidirectional texture function, the most advanced recent representation of the visual properties of surface materials. The presented compound random field model combines a non-parametric control random field with local multispectral models for single regions, and thus avoids demanding iterative methods for both parameter estimation and compound random field synthesis. The local texture regions (not necessarily contiguous) are represented by an analytical bidirectional texture function model consisting of single scale factors modelled by a three-dimensional moving average random field model, which can be analytically estimated as well as synthesized.
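The compound construction above can be sketched as follows: a control field assigns each pixel to a region, and a separate moving average local model fills each region. This is only an illustrative toy, assuming hypothetical hand-picked filters and a thresholded smooth noise field standing in for the non-parametric control field; it is not the paper's estimation procedure.

```python
import numpy as np
from scipy.signal import fftconvolve

rng = np.random.default_rng(0)
H, W = 64, 64

# Hypothetical control field: a label image assigning each pixel to one of two
# local-model regions (here just thresholded smoothed noise, for illustration).
smooth = fftconvolve(rng.standard_normal((H, W)), np.ones((9, 9)) / 81.0,
                     mode="same")
control = (smooth > 0).astype(int)

# Each region gets its own moving average local model: uncorrelated noise
# convolved with a region-specific filter (illustrative filters, not estimated).
filters = [np.ones((3, 3)) / 9.0,
           np.outer([1, 2, 1], [1, 2, 1]) / 16.0]
local_fields = [fftconvolve(rng.standard_normal((H, W)), f, mode="same")
                for f in filters]

# Composite texture: the control field selects which local model fills each pixel.
texture = np.where(control == 0, local_fields[0], local_fields[1])
```

Note that the regions selected by the control field need not be contiguous, matching the abstract's remark about the local texture regions.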
The bidirectional texture function (BTF) is the most advanced recent representation of the visual properties of a material surface. It specifies the surface's appearance under varying spatial, illumination, and viewing conditions. The corresponding enormous BTF measurements require a compact mathematical representation for compression that preserves visual fidelity. We present a novel BTF model based on a set of underlying three-dimensional moving average random field (3D MA RF) models. The 3D MA model assumes that the texture is the result of convolving an uncorrelated three-dimensional random field with a three-dimensional filter which completely characterizes the texture. The BTF model combines several spatial factors, subsequently factorized into a set of 3D MA representations, with a range map to produce the required BTF texture. This enables a high BTF space compression ratio, unrestricted texture enlargement, and reconstruction of unmeasured parts of the BTF space. We also compare the proposed model with its simpler two-dimensional variant in terms of colour-distribution fidelity.
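The core 3D MA assumption, that the texture is a convolution of an uncorrelated 3D random field with a 3D filter, can be sketched in a few lines. The filter below is a hypothetical placeholder; in the model it would be estimated analytically from the measured data.

```python
import numpy as np
from scipy.signal import fftconvolve

rng = np.random.default_rng(1)

# Uncorrelated three-dimensional random field (rows x cols x third dimension,
# e.g. spectral planes of one factor).
noise = rng.standard_normal((64, 64, 8))

# A small 3D filter stands in for the analytically estimated MA filter;
# in the 3D MA RF model this filter completely characterizes the texture.
h = np.ones((5, 5, 3))
h /= h.sum()

# 3D MA synthesis: convolve the uncorrelated field with the filter.
texture = fftconvolve(noise, h, mode="same")
```

Because synthesis is a single convolution of freshly drawn noise of any size, the enlargement is unrestricted: generating a larger texture only requires a larger noise field.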
Real-world materials often change their appearance over time. If these variations are spatially and temporally homogeneous, the material's visual appearance can be represented by a dynamic texture, a natural extension of the classic texture concept that includes time as an extra dimension. In this article we present a possible way to handle multispectral dynamic textures based on a combination of eigen-analysis of the input data and subsequent processing of the temporal mixing coefficients. The proposed method exhibits good overall performance, offers extremely fast synthesis that is unrestricted in the temporal dimension, and simultaneously compresses the original measured visual data significantly.
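A minimal sketch of this pipeline, assuming toy data and a first-order linear predictor as a stand-in for the paper's temporal-coefficient processing: eigen-analysis of the frames (via SVD) yields spatial eigen-images and temporal mixing coefficients, a linear model is fitted to the coefficients, and synthesis rolls the model forward for as many frames as desired.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy "measured" dynamic texture: T frames of P flattened pixels, here a
# temporally smoothed random walk so there is structure to capture.
T, P, k = 50, 256, 5
frames = np.cumsum(rng.standard_normal((T, P)), axis=0)

# Eigen-analysis of the input data: SVD of the mean-centred frame matrix.
mean = frames.mean(axis=0)
U, s, Vt = np.linalg.svd(frames - mean, full_matrices=False)
coeffs = U[:, :k] * s[:k]   # temporal mixing coefficients (T x k)
basis = Vt[:k]              # spatial eigen-images (k x P)

# Hypothetical temporal model: first-order linear predictor fitted to the
# mixing coefficients by least squares.
A, *_ = np.linalg.lstsq(coeffs[:-1], coeffs[1:], rcond=None)

# Synthesis unrestricted in the temporal dimension: roll the predictor
# forward from the last observed state for any number of new frames.
state, synth = coeffs[-1], []
for _ in range(100):
    state = state @ A
    synth.append(state @ basis + mean)
synth = np.asarray(synth)
```

Compression comes from storing only the k eigen-images, the temporal model, and the mean frame instead of all measured frames, and synthesis is a single small matrix product per new frame.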
The Bidirectional Texture Function (BTF) is the most advanced recent representation of the visual properties of surface materials. It specifies their appearance under varying spatial, illumination, and viewing conditions. The corresponding enormous BTF measurements require a mathematical representation that allows extreme compression while simultaneously preserving high visual fidelity. We present a novel BTF model based on a set of underlying mono-spectral two-dimensional (2D) moving average factors. A mono-spectral moving average model assumes that a stochastic mono-spectral texture is produced by convolving an uncorrelated 2D random field with a 2D filter which completely characterizes the texture. The BTF model combines several multi-spectral band-limited spatial factors, subsequently factorized into a set of mono-spectral moving average representations, with a range map to produce the required BTF texture space. This enables a very high BTF space compression ratio, unlimited texture enlargement, and reconstruction of missing unmeasured parts of the BTF space.
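The mono-spectral 2D variant differs from the 3D formulation in that each spectral band is a separate 2D MA factor with its own 2D filter. A minimal sketch, assuming hypothetical placeholder filters in place of the analytically estimated ones:

```python
import numpy as np
from scipy.signal import fftconvolve

rng = np.random.default_rng(3)
H, W = 64, 64

# One illustrative 2D MA filter per mono-spectral band (placeholders for the
# model's analytically estimated filters).
band_filters = [np.ones((3, 3)) / 9.0,
                np.ones((5, 5)) / 25.0,
                np.outer([1, 2, 1], [1, 2, 1]) / 16.0]

# Each band: an independent uncorrelated 2D random field convolved with
# that band's 2D filter.
bands = [fftconvolve(rng.standard_normal((H, W)), f, mode="same")
         for f in band_filters]

# Stack the mono-spectral factors back into a multi-spectral texture.
texture = np.stack(bands, axis=-1)
```

The per-band storage cost is just one small 2D filter, which is what makes the very high compression ratio possible; the trade-off against the 3D variant is that inter-band correlation must be handled by the factorization step rather than by the filter itself.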